Maxwell reviews starting to appear

When is the last time you owned an SLI setup? I've been running SLI 680s basically since they came out with zero issues (at least that I can remember). nVidia's drivers have come a long way.

Agreed, CrossFire has been working well for a while as well
 
If they are indeed 175W TDP I'm expecting to be able to SLI them on my 460W PSU.
Indeed. I could go quad-SLI on my 850W PSU with the 980 if I had the motherboard for it. It would be just a little more power draw with 4 980s compared to running 2 780s right now.
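
Rough napkin math on that (illustrative numbers only; TDP is a worst-case figure, and the non-GPU load below is an assumption, not a measurement):

```python
# Back-of-envelope PSU headroom for SLI builds (all figures illustrative).
GPU_TDP_W = 175         # rumored GTX 980 board power
REST_OF_SYSTEM_W = 100  # assumed CPU + motherboard + drives under load

for gpus in (1, 2, 4):
    total = gpus * GPU_TDP_W + REST_OF_SYSTEM_W
    print(f"{gpus}x 980: ~{total} W total system draw")
# 1x: ~275 W, 2x: ~450 W (tight but plausible on a 460 W unit), 4x: ~800 W
```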
 
Average frame rates only and no noise testing in this article make it nearly worthless.

I really don't understand why you guys bother.
 
People have been saying this for years; I don't buy it

Resolutions will just get bigger, post processing more intense

by the time VEGA is running 3x 4K Eyefinity/Surround at 120Hz with full LightBoost, the next big thing will be on the horizon... and the big half-kilowatt GPU setups will be there to push it

It's the grand order of things. Look at GPUs and CPUs of today. They pull less power, are on a smaller architecture, run cooler, and still do more and drive higher resolutions than those of years past. My 5nm GPUs will drive 3x 4K 120Hz monitors better than my 28nm Maxwell cards. Believe it. :)
 
I am going to buy TWO of these 980 badboys and SLI them for my 60-inch 1080p plasma. Then I am going to log onto BF4 and crank up the supersampling AA. I still won't be able to aim because I suck.

As long as 4GB is enough video memory for the next two years at 1080p this card is awesome.

One question, they mentioned a feature regarding upsampling at 1080p resolution. I wonder how much video memory that will use. I hope 4GB is enough. That is the ONLY issue here for me.

By the time I get into 4k there will be a whole new generation of cards available. Once you go Plasma you can't go back.
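
On the upsampling question above: a rough, hypothetical estimate of the render-target memory involved if the feature renders internally at 4K and downsamples to 1080p (assuming simple 32-bit color and depth buffers; the feature's actual internals aren't public):

```python
# Hypothetical VRAM cost of a 4K internal render target downsampled to 1080p.
# Assumes 4 bytes/pixel for color and 4 bytes/pixel for depth (illustrative).
W, H, BPP = 3840, 2160, 4

color_mb = W * H * BPP / 1024**2   # ~31.6 MB
depth_mb = W * H * BPP / 1024**2   # ~31.6 MB
print(f"4K color + depth: ~{color_mb + depth_mb:.0f} MB")
# Roughly 63 MB per buffer pair; even a handful of intermediate targets
# is a tiny slice of 4 GB. Texture assets, not render targets, eat VRAM.
```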
 
An unmodified reference-model 780 throttles badly in Furmark, resulting in lower clocks, lower power consumption, lower temps, and lower fan speeds (and, of course, lower performance)

What you're probably seeing is the 780 over-throttling to protect itself, and the 980 continuing to run full-speed thanks to its reduced power consumption and more-efficient design.

Possibly, we'll see.
 
It's the grand order of things. Look at GPUs and CPUs of today. They pull less power, are on a smaller architecture, run cooler, and still do more and drive higher resolutions than those of years past. My 5nm GPUs will drive 3x 4K 120Hz monitors better than my 28nm Maxwell cards. Believe it. :)

Of course, my point is that we will ALWAYS be asking more of them.

Also, we are soon going to run up against atomic scale limitations on process shrinkage.
 
Of course, my point is that we will ALWAYS be asking more of them.

Also, we are soon going to run up against atomic scale limitations on process shrinkage.

That is true; it looks like silicon is nearing its demise and we are moving to better processes.

I guess my point is that things will always move forward, and the more they are able to refine processes and control leakage, the less power will be required to do the same or more graphically. Maybe (and hopefully) the days when we need a 250 Watt TDP GPU to do what we need are gone.
 
Also, we are soon going to run up against atomic scale limitations on process shrinkage.
I wouldn't worry about that too much right now. Intel will hit that wall two to three years before the GPU vendors do, and GPU vendors have several tricks still up their sleeves that aren't dependent on node size. One of the key ones being DRAM stacking. Past that, NVIDIA and AMD still have the opportunity to move toward a tiled or [hybrid] ray-traced architecture, where multi-GPU becomes much more realistic a solution. A typical 2020 video card might have an array of discrete GPUs.

At the end of the day, NVIDIA and AMD will do what they need to do to keep moving performance forward and selling products. If that means the rendering paradigm needs to change, that's what will happen.
 
A typical 2020 video card might have an array of discrete GPUs.

At the end of the day, NVIDIA and AMD will do what they need to do to keep moving performance forward and selling products. If that means the rendering paradigm needs to change, that's what will happen.

and all that will take power

that is all I'm saying
 
Not without information on the fan profile.
Which we have, because the RPMs are listed in the tables in the OP's post...

We know what cooler it is and we know what RPM it's running at in each test. What more do you want? lol
 
Where are fan speeds listed?

EDIT: They're in the power consumption charts. Don't they realize images aren't text-searchable?
 
Where are fan speeds listed?

EDIT: They're in the power consumption charts. Don't they realize images aren't text-searchable?
I just told you, click the link in the OP's post and scroll down...

Here, I copy/pasted it for you. RPM figures are given:

[Image: power consumption table from the review, with fan RPM figures]


Not bad at all for Nvidia's tried-and-true reference cooler. 780 Ti running its fan at 3697 RPM vs. the GTX 980 running its fan at 2787 RPM when being stressed with Furmark (and note that the 780 Ti is throttling badly and STILL overheating, while the 980 appears to be doing just fine)
 
Zero fucks given about power draw unless that means you can OC the tar out of it.
Well, again, Nvidia's current Boost technology monitors temps and power consumption to determine when to IGNORE your overclock and throttle the card.

A more efficient card = less throttling = your overclock actually means something. You should be overjoyed at these power consumption figures if you're overclocking :D
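
For the skeptics, here's the general shape of the behavior being described (a toy model of a power/temperature-capped boost scheme; the constants are made up, and this is not Nvidia's actual algorithm):

```python
# Toy model of power/temp-limited boost throttling (not Nvidia's algorithm).
def effective_clock(requested_mhz, power_w, temp_c,
                    power_limit_w=180, temp_limit_c=80, bin_mhz=13):
    """Drop boost bins until both limits are respected."""
    clock = requested_mhz
    while power_w > power_limit_w or temp_c > temp_limit_c:
        clock -= bin_mhz   # ignore part of the user's overclock
        power_w *= 0.97    # assume each bin sheds ~3% board power (made up)
        temp_c -= 0.5      # and roughly half a degree (made up)
    return clock

print(effective_clock(1400, power_w=170, temp_c=75))  # 1400: no throttle
print(effective_clock(1400, power_w=250, temp_c=75))  # 1257: bins sacrificed
```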

You quoted the edit. Near as I'm aware, today is not Feign Ignorance Day.
I actually quoted you BEFORE your edit. I updated the quote when you updated your post.

Figured I might as well post the image anyway (and what I said wasn't wrong, I DID tell you where to get the information :p ), since there seems to be a general lack of reading comprehension on this forum. If it didn't help you, it might help someone else.
 
With a 175W TDP, people can now SLI on an SFX PSU.
How do you SLI on a mini ITX board with only one PCIe slot? I suppose some people are using SFX PSUs with uATX, but almost always those cases use full size ATX PSUs. SFX is generally used for mini-ITX, no?
When is the last time you owned an SLI setup? I've been running SLI 680s basically since they came out with zero issues (at least that I can remember). nVidia's drivers have come a long way.
Personally? Not since my GTX 295. I swore never again. Although it looks like it's gotten substantially better, the same main problems remain. Lower performance per dollar, more heat, higher power requirements, mediocre scaling in the best case and no scaling at the worst, doesn't work with mini-ITX, microstutter, can't use video outputs on the secondary card, reduced PCIe bandwidth per card, takes up more physical PCIe slots (meaning SLI on uATX deprives you of any free slots for RAID/sound/wireless/PCIe SSD/etc. cards), and so on.

Don't mistake my meaning. SLI is the best (and only) solution for those who need the absolute maximum graphics performance possible. If that's you, do it. But for the vast majority of people, even on [H], it just doesn't outweigh the drawbacks in my mind. It's also disingenuous to tout multiple cheap cards beating last gen's single card at a similar price as a dramatic improvement. Compare top-end single-card speeds and prices; that's much more fair.
 
How do you SLI on a mini ITX board with only one PCIe slot? I suppose some people are using SFX PSUs with uATX, but almost always those cases use full size ATX PSUs. SFX is generally used for mini-ITX, no?
There are plenty of MicroATX cases using SFX power supplies, actually.
 
It will. But supplying power to many ICs and to multiple cards is not overly difficult.

I'm not saying that it is; all I have been saying is that the 250ish Watt TDP envelope in the PCIe form factor is not going away any time soon. We will push resolution, frame rate, and eye candy such that we will be building fire-breathing, power-hungry setups for a long time.

I have been reading about the "demise of the discrete GPU" for years ffs...ummmm, no.

Until I am surrounded on all four walls, ceiling, and floor with displays at resolutions and rates my brain cannot discern from real life, we will be bolting up systems to strive for that.
 
Personally? Not since my GTX 295. I swore never again. Although it looks like it's gotten substantially better...
I've seen people complaining about microstutter with CrossFire in the past, but from what I read even that was taken care of; there aren't that many complaints about SLI.
Is microstutter really an issue with SLI as of right now? Anybody...? That would be a big no-no for me (I'm planning to run SLI in the near future).
 
I'm not saying that it is; all I have been saying is that the 250ish Watt TDP envelope in the PCIe form factor is not going away any time soon. We will push resolution, frame rate, and eye candy such that we will be building fire-breathing, power-hungry setups for a long time.
And I'm suggesting you not think too much about what PCIe enables or doesn't enable. Like traditional rasterization, PCIe will eventually and invariably be replaced with something else, as needs demand. NVIDIA, in fact, already has their own solutions.

I have been reading about the "demise of the discrete GPU" for years ffs...ummmm, no.
I'm not talking about the demise of discrete GPUs; I'm talking about the inevitability of many GPUs. When we eventually hit process walls, we'll probably not care. It won't impede performance evolution.
 
And I'm suggesting you not think too much about what PCIe enables or doesn't enable. Like traditional rasterization, PCIe will eventually and invariably be replaced with something else, as needs demand. NVIDIA, in fact, already has their own solutions.
I think he was referring more to the physical size of a graphics card. Much more than 250W in that volume starts to get pretty tricky to cool.
 
What he said up there: the 970 @ $325, if true, is going to be the home-run card. I'm waiting for the Titan 900 or whatever it is
 
Well, again, Nvidia's current Boost technology monitors temps and power consumption to determine when to IGNORE your overclock and throttle the card.

A more efficient card = less throttling = your overclock actually means something. You should be overjoyed at these power consumption figures if you're overclocking :D

Don't really care about the boost technology. Just flash the BIOS to force what you want.
 
It looks like some of those tests were run in a quiet mode or worse for the 290X. The 290X was listed as being clocked at 894 or 742 MHz for both power draw and temperature. That performance chart needs to indicate whether the 290X was throttling during gameplay or if it was given enough airflow to stick at the full 1,000 MHz.

If the 970 and 980 really do price out as leaked, I think the biggest factor will be whether the 980 has unlocked voltage control. If it can be heavily, and reliably, OC'ed like the 580s or AMD 280X cards, then the 980 might be worth the price.
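
For perspective on those 290X clocks (a naive linear-scaling assumption, which is an upper bound on the real-world difference):

```python
# Clock deficit if the 290X review sample throttled (linear-scaling assumption).
FULL_MHZ = 1000
for throttled in (894, 742):
    print(f"{throttled} MHz = {throttled / FULL_MHZ:.0%} of full clock")
# 894 MHz -> 89%, 742 MHz -> 74%: more than enough to skew a comparison chart
```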
 
Another review apparently? Not sure if I'm re-posting but I didn't see it anywhere in the various threads: http://videocardz.com/52552/nvidia-geforce-gtx-980-and-gtx-970-press-slides-pictures-charts


lol, I about shit myself at the 7 Gbps memory being billed as 9.3 Gbps "effective". You know something's up when they don't bother comparing against their last generation and go all the way back to the GTX 680.

I'm not giving them the excuse that two-year upgrade cycles are more the norm than yearly, blah blah; let's just point out that Nvidia bullshits on their slides just as much as AMD does.

Gave me a hearty laugh.
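
For reference, the arithmetic behind that slide (assuming the commonly reported 256-bit bus; the 9.3 Gbps "effective" figure presumably bakes in Maxwell's claimed color-compression savings rather than any real clock increase):

```python
# Raw vs. "effective" memory bandwidth on an assumed 256-bit bus.
BUS_BITS = 256

def bandwidth_gb_s(per_pin_gbps):
    """GB/s = per-pin rate (Gbps) * bus width (bits) / 8 bits per byte."""
    return per_pin_gbps * BUS_BITS / 8

print(bandwidth_gb_s(7.0))  # 224.0 GB/s actually moved over the bus
print(bandwidth_gb_s(9.3))  # 297.6 GB/s, the slide's compression-adjusted claim
```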
 
Average frame rates only and no noise testing in this article make it nearly worthless.

I really don't understand why you guys bother.

All links to reviews must be cleared through wonderfield before they're allowed to be posted!
 
If it's true about perf/watt, they will have a lot of headroom for a Ti + Titan model. I can see it now: a Ti with 10-15% more CUDA cores, a Titan with 30%+ more memory/bandwidth
 
Don't really care about the boost technology. Just flash the BIOS to force what you want.
Even after flashing a custom BIOS, my GTX 780 still throttles when running Furmark (and that's with the Power Target at 300% and the temps hovering at only 65C). There's simply no getting around it.

Like I said: "A more efficient card = less throttling = your overclock actually means something. You should be overjoyed at these power consumption figures if you're overclocking"
 
Should have great overclocking potential with those power figures.

Impressive, most impressive. Should be a delicious prospect for SLI. Less heat and only 350W at the wall!
 
Even after flashing a custom BIOS, my GTX 780 still throttles when running Furmark (and that's with the Power Target at 300% and the temps hovering at only 65C). There's simply no getting around it.

Like I said: "A more efficient card = less throttling = your overclock actually means something. You should be overjoyed at these power consumption figures if you're overclocking"

Meh, Furmark is a terrible stress test anyway. It maxes out temps, but there are plenty of things that are more stressful.
 