I like my 2500K idling at 1.6GHz and then jumping to 4.2GHz when I open up a game. I wish I could get the voltage to drop down at idle again, too.
My mobo has no offset, so I just had to leave it on Auto, which is what he should do. The only problem is that Auto will go a bit higher than needed. Of course, that's probably better than idling at a higher voltage than needed for the 95% of the time when his PC isn't playing games.

If you use an offset (or Auto) voltage it should down-volt too; only a manual voltage setting will keep it at a fixed voltage.
Bump if you agree and you're fed up with the latency that the trend of energy-saving processors introduces!
In 1997 Intel released the Pentium 2 processor running at 233MHz. It operated at 1.9-2.1v, was rated at 23.7W, had 7.5 million transistors and a maximum operating temp of 65C.
In 2004 Intel released the Pentium 4 processor running at 2.8-3.4GHz. It operated at 1.25-1.4v, was rated at 89W, had 125 million transistors and a maximum operating temp of 69C.
This year Intel released the 3960X running at 3.3-3.9GHz. It operates at .60-1.35v, is rated at 130W, has 2.27 billion transistors and a maximum operating temp of 90C.
23W to 130W. Where's this energy savings?
When they drop the size, they increase the number of transistors and the clock speed, hence a faster chip, but it's using more power, or at least the same power as before. They do cool things like C1E and SpeedStep to save power. 5 years ago a 650W power supply was big. Now, it will barely power the fastest video cards and CPUs. Pretty sure there's no "power savings" or "energy conservation" going on in the computer industry, unless you're talking about laptops or portable devices.
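If you want to actually watch SpeedStep-style scaling in action, here's a minimal Python sketch that reads the Linux cpufreq sysfs interface. This assumes a Linux box with cpufreq support; the exact paths and files vary by kernel and driver, so treat it as a starting point, not a guaranteed API.

```python
from pathlib import Path

def khz_to_ghz(khz: int) -> float:
    """Convert the kHz values cpufreq reports into GHz."""
    return khz / 1_000_000

def read_cpufreq(cpu: int = 0) -> dict:
    """Read current/min/max frequency (kHz) for one core from sysfs.
    Assumes a Linux kernel exposing the standard cpufreq files."""
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq")
    return {name: int((base / name).read_text())
            for name in ("scaling_cur_freq", "scaling_min_freq", "scaling_max_freq")}
```

On a 2500K like the one above, you'd expect `scaling_cur_freq` to sit near 1.6 GHz at idle and jump toward the max when a game loads a core.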
I read Pitbull's post as being for Vyedmic's benefit due to the props to C1E and Speed Step.
Honestly, I could give 2 shits whether or not my computer saves power. If I want to save power, I'll shut it off. Yes, the new additions from Intel like C1E, SpeedStep and the like are nice, and yes, I use them, but at the end of the day I couldn't give a shit whether it uses 600W of power or 1000W of power. I just want it fast as shit!
You can't tell me that 95% of the people around here wouldn't FLOCK to buy the next processor if it was TWICE as fast as we have now but used TWICE as much power.
At the end of the day the power saving bullshit is nice, but people care a lot more about performance.
It's a little thing called performance per watt; it's about energy efficiency, not total power consumption. Gimp and downclock that IB CPU to P2 speeds and it will still demolish the P2. You have to look at the full picture.
In 2008 IBM's Roadrunner supercomputer achieved 376 MFLOPS/Watt.
In 2011 IBM NNSA/SC Blue Gene/Q Prototype 2 achieved 2097.19 MFLOPS/Watt.
http://en.wikipedia.org/wiki/Performance_per_watt
Nope, no improvements here.
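Putting those two supercomputer figures side by side, along with the CPU specs quoted earlier in the thread, makes the efficiency trend concrete. A back-of-the-envelope Python sketch (using only the numbers already posted above):

```python
# MFLOPS/W figures quoted above (from Wikipedia's performance-per-watt page).
roadrunner_2008 = 376.0
blue_gene_q_2011 = 2097.19
print(f"Supercomputer efficiency gain: {blue_gene_q_2011 / roadrunner_2008:.1f}x")  # ~5.6x

# TDP per million transistors, from the CPU specs quoted earlier.
p2_1997 = 23.7 / 7.5        # Pentium 2: ~3.16 W per million transistors
x3960_2011 = 130 / 2270     # 3960X: ~0.057 W per million transistors
print(f"Power per transistor dropped ~{p2_1997 / x3960_2011:.0f}x")  # ~55x
```

So even though total TDP went from 23.7W to 130W, each transistor in the 3960X sips roughly a fifty-fifth of the power a Pentium 2 transistor needed.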
Not everyone here is into max power. While just about everyone here is an enthusiast, not all are looking for max performance; some are all about how much they can get out of small systems, efficient systems, silent systems, etc. Power use doesn't matter much to me, since everything is included in my rent. But I still care about it, along with noise and other things. It also reaches a point for some people where the cost of power just isn't worth it, big time for the people who run servers and the like.
I won't argue that computers today are probably more efficient than 10 years ago or even 5 years ago, but they also use a lot more power (and I don't see anyone around here using an IBM supercomputer).
I would like to see someone do a study on performance per watt though, it would be interesting to see. I had a Pentium 2 450MHz with the first GeForce 256 card they made back in the day and ran Windows 95 on it. I played all kinds of online games with it. The entire thing ran on a 380W power supply, and that was big back then.
Right, I can definitely see a market for it, especially in the server segment. But I assume we're talking about the home desktop market here. Most people at home don't have dozens or even hundreds of machines in a data center where saving just a few watts per box would be a huge savings every month or year.
It's a different scale but the same principle. And running even a single efficient computer at home will give you monthly savings, and of course that stacks up over the years.
My guess is that even a very low power Atom processor would kick that P2-233 right in the nuts, and do it at a lot less than 23W. Heck, even the chips in smartphones or tablets are probably a match for that. Anyone know how many watts a Tegra 3 uses?
Edit: I checked, looks like the Tegra 3 uses 1-2W at load. So I think that pretty much blows up your "no power savings" argument.
I think a lot of you would be shocked at how high the demand for lower power consuming processors is.
Pretty sure we disregarded the whole tablet/mobile/laptop segment right from the start here... we'll let you catch up.
Pretty sure we didn't. Who cares what the form factor is? The undeniable fact is that performance per watt has increased dramatically, even if total power consumption (in some cases) has not. That Atom can run a server as well as or better than a P2-xxx, and do it at a fraction of the power, so how is that not relevant? The iPad can play games that that P4 system could only dream of, so how is that not relevant?
It's hard reading posts on a messageboard.
The fact that you can turn off power-saving features is irrelevant. I think the point he is making is that if all of the research and development that went into these power-saving features had instead gone into making faster chips, the current offerings would be much faster than what Intel is actually releasing.
For example, I recall a lot of people recently lamenting that Ivy Bridge would not have a consumer-grade 6-core chip, and conjecturing that if Ivy Bridge weren't so concerned with power saving there would likely be one.