Power

dracop

Limp Gawd
Joined
Dec 8, 2005
Messages
133
I mean the electrical kind, not the e-peen in-game kind.

I really feel the power draws on video cards are getting ridiculous. Every other subsystem is going down in power, but we are forced to consider upgrading PSUs to ever higher wattages with a constantly shifting selection of power connectors.

CPUs have made huge strides, hell, even RAM has made huge strides à la DDR2 in terms of becoming more power efficient.

I know the HardOCP people don't give a damn; they've made comments to that effect. But between the cost of the PSU, the cost of that expanded electrical draw, the heat issues it brings, etc., it is just getting out of control.

Intel/AMD have managed to continue building more powerful chips without building nuclear reactors; why is it that ATI (AMD now) and NV get carte blanche?

Is it the push to get us to photorealistic graphics? (We are pretty damn far.) Mayhaps, 20 years down the line when we get there, they will focus on more power-efficient graphics systems.

What prompted this strange outburst, you say? My power bill is what. I run an Antec 550 W PSU: 550 W / 1000 x $0.17 (price per kWh) x 24 x 30 = ~$67 a month to run my computer. AND THAT 550W IS TOO WEAK TO PUT A NEW GRAPHICS CARD INTO????? :mad: Now granted, I could put my computer into an energy-savings mode to reduce the total consumption (playing with that), but still, at $807/yr and that not being enough, where will the line be drawn?
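If you want to sanity-check that math, here's the same back-of-the-envelope calculation as a small Python sketch. The constant 550 W draw is a worst-case assumption (the box won't actually pull the PSU's full rating 24/7), and the $0.17/kWh rate is just my own:

# Rough electricity cost, assuming a constant draw around the clock (worst case).
def monthly_cost(watts, price_per_kwh, hours_per_day=24, days=30):
    kwh = (watts / 1000.0) * hours_per_day * days   # energy used in a month
    return kwh * price_per_kwh

print(monthly_cost(550, 0.17))        # ~67 dollars a month
print(monthly_cost(550, 0.17) * 12)   # ~808 dollars a year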

I'm supposed to step up to a 650 or 700 W PSU, and waste more cash, just for the GPU, not to mention the added expense of the PSU itself. Some of you may feel I am being cheap, and to an extent I am, but to be forced to spend $150 on a new PSU, and then an additional $240 per year JUST for the GPU?

Is it time for a console as my primary gaming machine? Maybe my cheap a** needs to go that way, because this is getting silly between the h/w upgrades and the power. What's worse is that current hardware has TONS of headroom, but the gaming software industry atm is too lazy/cheap to program for it, preferring to just churn out unoptimized games.

A Day of Reckoning will be coming soon, a Day in which PC game makers have no profit margin. Mark my words, the Morlocks will rebel, especially with consoles getting as neat as they are.
 
It's just the nature of the architecture. With the G80 you have 'basically' a 128-core processor with each core running at 1.35 GHz (that's what, roughly 172 GHz all together?), not to mention the other parts of the GPU, such as the 24 ROPs running at 575 MHz each, and then the memory subsystem attached to it, running a wide bus at close to 2 GHz effective. Keep in mind the video card isn't just the GPU; it is the entire package, memory, etc.

As they work on better process technologies it will help greatly. Right now 80nm is the best in production for GPUs; Conroe, on the other hand, is 65nm. GPUs haven't reached CPUs in those process efficiencies yet (and neither has AMD, for that matter :p ).

GPUs do employ techniques to reduce power, such as transistor gating and multiple dynamic clock domains. Parts of the GPU are shut off until needed, different components operate at different clock speeds, and the frequencies are different in 2D than in 3D, so the power draw at idle is different from full load.

The future will involve making the process more efficient and making more use of gating, clock domains, and other techniques to keep power and heat down.
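To put rough numbers on what those separate 2D/3D states buy you, here's a toy Python sketch of the average draw when a card sits at its low 2D clocks most of the day and only ramps to 3D clocks while gaming. The wattages and hours below are made-up assumptions, not measurements of any real card:

# Toy model of a GPU with distinct idle (2D) and load (3D) power states.
# All numbers are illustrative assumptions.
IDLE_WATTS = 35              # assumed draw at 2D/desktop clocks
LOAD_WATTS = 145             # assumed draw at full 3D clocks
GAMING_HOURS_PER_DAY = 4

avg_watts = (IDLE_WATTS * (24 - GAMING_HOURS_PER_DAY)
             + LOAD_WATTS * GAMING_HOURS_PER_DAY) / 24.0
print(round(avg_watts))      # ~53 W average, versus 145 W if it ran at load 24/7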
 
Intel/AMD have managed to continue building more powerful chips without building nuclear reactors; why is it that ATI (AMD now) and NV get carte blanche?

Oh? Maybe it's just that I'm old, but I remember my first Intel chip being passively cooled with nothing but a heatsink, and before that my Cyrix didn't need any heatsink at all. Yes, I know this was back in the 486 and early Pentium (200) days, but the point is CPUs have become MUCH bigger power hogs. Look at the latest Pentium 4 chips and AMD's FX series of CPUs; hell, the only backward stride that was made was the Core 2 Duo.

The fact is, while they might not have needed nuclear reactors, they still needed the nuclear cooling towers, and that translates to power used. GPUs are simply in that same boat; maybe someday some C2D-like GPU will come out and they'll find a way to put raw mathematical computing power into a chip without requiring massive clock cycles to accomplish the task.
 
Don't forget about the price, pretty expensive if you ask me!
Keep in mind that today's CMOS devices are pretty darn inefficient, less than 10%; the rest goes to heat! Yeah!
Is it the push to get us to photorealistic graphics? (We are pretty damn far.) Mayhaps, 20 years down the line when we get there...

I think we're only 10 years away from that! Damn, I want to see a quantum GPU.
 
This is the reason (summer electric bill) I researched my power supply purchase and found one with a supposedly high 80% efficiency rating.
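For what it's worth, the efficiency figure works on the wall side: the PSU pulls roughly (DC load / efficiency) from the outlet, and the difference is dumped as heat. A quick Python sketch with an assumed 300 W DC load shows what 80% buys you over an older ~70% unit:

# Wall draw for a given DC load at different PSU efficiencies (assumed figures).
def wall_draw(dc_load_watts, efficiency):
    return dc_load_watts / efficiency

for eff in (0.70, 0.80):
    wall = wall_draw(300, eff)
    print(eff, round(wall), round(wall - 300))   # efficiency, wall watts, watts lost as heat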
 
Hell, back in the 8086 days I don't even recall the CPU having a heatsink. I certainly didn't have a cooling fan on my 386.

Have CPUs gotten worse over 20 years? Yes, they have. But the last 4 years have seen both AMD and Intel move to lower-power chips that simply use electricity more efficiently.

AMD's AM2 design came with some power-saving features, as did Intel's C2D.

But while CPU, RAM, and mobo (PCIe spec) development seems to be headed down the efficiency path, or at the least efficiency is considered desirable there, GPUs are doing the opposite and wiping out the gains from Intel/AMD.


I think they need to come up with a better way to generate graphics, personally; this mass brute-force approach is leading us down a bad path. I think they will rapidly hit a point of diminishing returns if they just keep throwing electricity and RAM at it. Maybe there is a point to DX10, although I don't know that it really addresses the problem (haven't seen how anyone is writing their DX10 code).

For my upcoming rig, I'm planning on a new PSU anyway, so I'll look into getting the highest efficiency I can and try not to waste so much power. It's kind of irritating that my sister pays half what I do in electric (same utility and area) and runs twice the crap (except I run this box).
 
Quantum computing is obviously the next revolutionary step; other than that, all they can do is go evolutionary.
 
In a few years electrons will be replaced by photons! In computer chips, that is.
 