Software Progress Beats Moore’s Law

HardOCP News

According to this article, a recent report by an independent group of science and technology advisers to the White House says that software advances, not hardware, are driving computer innovation.

There are no such laws in software. But the White House advisory report cited research, including a study of progress over a 15-year span on a benchmark production-planning task. Over that time, the speed of completing the calculations improved by a factor of 43 million. Of the total, a factor of roughly 1,000 was attributable to faster processor speeds, according to the research by Martin Grotschel, a German scientist and mathematician. Yet a factor of 43,000 was due to improvements in the efficiency of software algorithms.
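As a rough illustration of the kind of gain the report is describing (this is a made-up Python sketch, not the production-planning benchmark itself), compare a quadratic-time sort against the built-in O(n log n) sort: the algorithmic speedup keeps growing with the problem size, while a hardware upgrade only buys a roughly constant factor.

[code]
# Illustrative only -- not the benchmark from the report.
import random
import time

def insertion_sort(a):
    """Classic O(n^2) insertion sort."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

for n in (1_000, 5_000, 10_000):  # the quadratic version gets painful fast
    data = [random.random() for _ in range(n)]

    t0 = time.perf_counter()
    insertion_sort(data)
    slow = time.perf_counter() - t0

    t0 = time.perf_counter()
    sorted(data)  # Timsort, O(n log n)
    fast = time.perf_counter() - t0

    print(f"n={n:>6}: algorithmic speedup ~{slow / fast:,.0f}x")
[/code]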
 
Really? Then how come most applications other than rendering and movie encoding/decoding do not fully load more than 2 cores?

I call bullshit.
 
Doesn't technically break Moore's Law. The hardware was always capable of those speeds; it's just that the software was suffering from Crysis 1.0 syndrome. :p
 
Wait, fixing bad code can speed something up more than throwing hardware at it? Who would have guessed?

Seriously though, it's measuring two different things. Anyone could write the best algorithm right now, if they're able to think it up. It would always be the best algorithm, and only compiler improvements that take advantage of newer processor architectures would ever be able to improve it.

I could design the best chip ever and it could be impossible to fab for a few more years. And then process technology would improve more and my design would be obsolete.
 
I thought software programmers were the ones holding back hardware engineers.
 
These guys probably aren't talking about consumer software, but rather industrial/specialised software.

That said, I didn't read the article, so my point is probably moot.
 
It's certainly possible to see this sort of increase over the last 15 years. I'll give you three very good reasons:

1. The major transition from interpreted to compiled languages happened in the 1990s. In the 70s/80s interpreted languages were king (majority of users), but by the early 90s they were holding new machines back. Once IDEs became useful, and processors were fast enough to compile programs in a reasonable amount of time, people moved to compiled languages.

2. The 1990s marked the first wide availability of advanced languages with standardized libraries. No longer was Joe Schmoe Programmer wasting his time and hurting performance by poorly implementing a sort function or linked list. Instead, you got the fastest-possible implementation for free, and that left Joe Schmoe Programmer more free time for SELF IMPROVEMENT (it was either improve or lose your job).

3. Programmers of the 70s/80s did not trust the floating-point unit (if there was even one available), so they did operations in fixed-point math that were extremely slow. The proliferation of pipelined, accurate floating-point available on every processor in the 1990s changed this trend, and increased performance impressively.
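
A minimal sketch of the fixed-point style described in point 3 (the Q16.16 format here is just an assumption for illustration): every multiply needs an extra shift to keep the scale right, where the FPU now does the whole thing in one hardware instruction.

[code]
SCALE = 1 << 16  # Q16.16: 16 integer bits, 16 fractional bits

def to_fixed(x):
    return int(round(x * SCALE))

def fixed_mul(a, b):
    # The product of two Q16.16 values is Q32.32; shift back down to Q16.16.
    return (a * b) >> 16

def to_float(a):
    return a / SCALE

a, b = to_fixed(3.25), to_fixed(0.5)
print(to_float(fixed_mul(a, b)))  # 1.625, computed with integer math only
print(3.25 * 0.5)                 # same answer from the FPU
[/code]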
 
palin-whoosh.jpg
 
That government report is so stupid that everyone responsible for it should be shot in the head (at least if the news article is correct).

In the 70s and 80s, most professional programs were hand-coded in Assembly language. You don't get faster than writing to the metal. As hardware has become more powerful, programmers have shifted from more efficient (faster) coding to easier (slower) coding, such as from Assembly to C to C++ to Java. To make matters worse, as hardware becomes more complex, such as with multi-core processors, programmers have a harder time using it efficiently.

But, ignoring all that, software algorithms becoming 43,000 times more efficient?? How ****** stupid is that on its face? Take "x = x + 1": how can I make that 43,000 times more efficient?
 
Yeah, O.K... Fifteen years ago that production-planning software was about as high on the optimized software list as my heel is to the top of my head. So yeah, that particular piece of software has been optimized by a factor of 43,000. And by the way, Martin, hardware, specifically the CPU, has seen an increase of over 3,000 times in clock speed alone. If you add all the advances to the instruction set(s) over the years, it's even greater.

But the conclusions are WAY off base here. It's NOT advances in software that are driving hardware innovation, it's users' need for more information and questions answered faster that is driving hardware AND software innovation. Believe me, if people didn't need to transcode video or encrypt something, the HARDWARE manufacturers wouldn't have added specific instructions to do those tasks faster.

Bullsh** flag on the numbers by Mr. Grotschel, and most definitely his conclusions. IMHO.
 
It's certainly possible to see this sort of increase over the last 15 years. I'll give you three very good reasons:

1. The major transition from interpreted to compiled languages happened in the 1990s. In the 70s/80s interpreted languages were king (majority of users), but by the early 90s they were holding new machines back. Once IDEs became useful, and processors were fast enough to compile programs in a reasonable amount of time, people moved to compiled languages.

2. The 1990s marked the first wide availability of advanced languages with standardized libraries. No longer was Joe Schmoe Programmer wasting his time and hurting performance by poorly implementing a sort function or linked list. Instead, you got the fastest-possible implementation for free, and that left Joe Schmoe Programmer more free time for SELF IMPROVEMENT (it was either improve or lose your job).

3. Programmers of the 70s/80s did not trust the floating-point unit (if there was even one available), so they did operations in fixed-point math that were extremely slow. The proliferation of pipelined, accurate floating-point available on every processor in the 1990s changed this trend, and increased performance impressively.

Well, you contradict yourself here: the last 15 years started in 1996, which seems to be after all the advances in computer programming you cite in your three points. Try again.
 
Article from December
http://agtb.wordpress.com/2010/12/23/progress-in-algorithms-beats-moore’s-law/

Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later – in 2003 – this same model could be solved in roughly 1 minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008.
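
Quick sanity check on the figures quoted above:

[code]
# 82 years expressed in minutes, versus the claimed overall factor.
minutes_in_82_years = 82 * 365.25 * 24 * 60
print(f"82 years ~ {minutes_in_82_years:,.0f} minutes")                # ~43.1 million
print(f"1,000 (hardware) x 43,000 (algorithms) = {1_000 * 43_000:,}")  # 43,000,000
[/code]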
 
The new Photoshop runs like molasses on my brand-new box, while Photoshop 6 runs faster on a 5-year-old machine.
 
Well, you contradict yourself here: the last 15 years started in 1996, which seems to be after all the advances in computer programming you cite in your three points. Try again.

And you have no idea how the computing world works, do you?

A new hardware feature can be introduced, but it doesn't get integrated into most programs until it's widely and cheaply available. See my note on the FPU: 1993 saw the release of the P5 with a pipelined high-performance FPU, but those processors were ungodly expensive until about 1996. 1997 saw the release of consumer-level P6 processors, which also increased FPU performance significantly (and dropped the P5 prices). Once these were in place, software could be upgraded (say a year or two for that to roll out), and you're looking at about the year 2000 before most software made extensive use of the FPU on the majority of consumer platforms.

As for other features I mentioned, they all have lead times before they get integrated because every software house has delays in accepting entirely new concepts:

* large programming houses have processes that are held sacred above all else, and people in control who fear change. If you get these people to adopt completely new software development paradigms in less than five years, you're a miracle worker.

* small programming houses do not have the money or manpower to migrate to new development paradigms and environments on a whim. They have a very common problem where the code maintenance team is small or non-existent, so porting code to a new platform requires spin-up time from some new employee, and often results in a failed project with money wasted.

I guarantee you that most big software houses spent most of the 1990s fighting STL and modern language adoption, and the smaller companies with custom software decided it was cheaper to throw faster hardware at the problem than go through the pain of a redesign. But most big software houses adopted new coding practices by the early 2000s, and after hardware performance hit a wall in the last decade, even small businesses have accepted that the software must change.
 
These numbers reek of coming from a very contrived case/situation (like directly inverting a large, sparse matrix vs. using some newer FFT-based technique, or simply one with better conditioning). Still, there does seem to be an inverse relationship between the rate of increase in the 'speed' (efficiency) of software and the speed of hardware.
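
A minimal sketch of that kind of contrived comparison (assuming NumPy and SciPy are available; this is not from the report): the same sparse linear system solved with a dense factorization versus a solver that exploits the sparsity, where the gap widens quickly as the problem grows.

[code]
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 3000
# Tridiagonal system: overwhelmingly zeros, like many large planning models.
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

t0 = time.perf_counter()
x_dense = np.linalg.solve(A.toarray(), b)  # ignore the structure, O(n^3)
dense_t = time.perf_counter() - t0

t0 = time.perf_counter()
x_sparse = spla.spsolve(A, b)              # exploit the structure
sparse_t = time.perf_counter() - t0

assert np.allclose(x_dense, x_sparse)
print(f"dense: {dense_t:.3f}s  sparse: {sparse_t:.3f}s  ratio ~{dense_t / sparse_t:.0f}x")
[/code]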

When computations were done by a 5x5 array of grad students/secretaries on adding machines, you learned to optimize your software... (also led to the development of some good parallel computing techniques).

GET OFF MY LAWN!
 
And you have no idea how the computing world works, do you?
You have absolutely no idea how long I've been involved in the computing world. Sorry, you lose this point, too.

....and the smaller companies with custom software decided it was cheaper to throw faster hardware at the problem than go through the pain of a redesign. But most big software houses adopted new coding practices by the early 2000s, and after hardware performance hit a wall in the last decade, even small businesses have accepted that the software must change.

Well, you missed the point of my posts. But you made my argument for me, ty. Software lags behind hardware. It's hardware innovation that drives software change, not the other way around.
 
Well, you missed the point of my posts. But you made my argument for me, ty. Software lags behind hardware. It's hardware innovation that drives software change, not the other way around.

No, it's users' need for more information and questions answered faster that is driving hardware and software innovation.
 
No, it's users' need for more information and questions answered faster that is driving hardware and software innovation.

What, you can't put 2 and 2 together? Users drive change: hardware changes first, then software.

Does that spell it out enough for ya?
 
Hmm, after looking at it more closely, the study is actually fine. The NY Times blogger just made a sensational headline attributing the improvement of algorithms to software in general. Obviously software in general is still getting more and more bloated all the time, not the reverse.
 
What, you can't put 2 and 2 together? Users drive change: hardware changes first, then software.

Does that spell it out enough for ya?

Virtualization, 3D graphics, transcoding, iSCSI, TCP/IP, encryption, virtual paging, disk as tape, VPN, CAD, and so on.

Yep, none of the advances there started in software. :rolleyes:
 
Obviously software in general is still getting more and more bloated all the time, not the reverse.

Software bloat is a separate issue from whether software is getting more efficient computationally.

It's entirely conceivable that software improvements make significantly larger gains than hardware improvements.
 