Computer Chip Sales Cool Down in Summer

Microprocessor sales are down this quarter? That’s odd, the economy is doing so well I can’t figure out why people aren’t spending more money. :rolleyes:

For the third quarter, worldwide microprocessor sales rose only 2.5 percent from the second quarter, while shipments inched up just 2.1 percent from the prior quarter. On a year-over-year basis, results were healthier, with chip sales rising 24.1 percent and shipments 8.6 percent over the third quarter of 2009.
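
For anyone who wants to see how those growth percentages shake out, here's a quick back-of-the-envelope sketch. The dollar figures below are invented placeholders; only the resulting percentages line up with the quote:

```python
# Sketch of how quarter-over-quarter and year-over-year growth percentages
# are computed. The dollar figures are invented placeholders; only the
# resulting percentages mirror the quote above.

def growth_pct(current, prior):
    """Percent change from a prior period to the current one."""
    return (current - prior) / prior * 100.0

q3_2010_sales = 10.25  # hypothetical revenue, billions of dollars
q2_2010_sales = 10.00  # prior quarter (=> ~2.5% growth)
q3_2009_sales = 8.26   # same quarter a year earlier (=> ~24.1% growth)

print(f"Quarter-over-quarter: {growth_pct(q3_2010_sales, q2_2010_sales):.1f}%")
print(f"Year-over-year:       {growth_pct(q3_2010_sales, q3_2009_sales):.1f}%")
```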
 
Sorry, it's all my fault.
I've been too busy to upgrade the old system my wife is using, and I just keep redeploying all the old P4's at the office instead of buying new systems.

Maybe next year.....
 
What's the point? These damn things are too powerful. Had my i7 a year now and still nothing I do pushes it to 100% usage.

Since no one was spending money for the last couple of years, all the companies went way out of their way to make their products so irresistible that we wanted to plop down a couple hundred dollars. As a result we got awesome CPUs where 4 cores is the norm, super fast SSDs that have dropped hugely in price and gotten hugely faster, 4GB DDR3 modules under $100, 3TB platter drives, 28" monitors for $300, and 30" 2560x1600 monitors for $1000. Insanely fast video cards that pull more power than all the rest of the computer. All the hardware got over-the-top awesome the last couple of years, and yet what is there to do with it now?
 
Same here. I have an i7 overclocked to 3.5GHz, and other than benchmarks/burn-in programs, I've only found one app that can push it above 50%. It's a video compression app, and it will hit 85% when decompressing a DVD and recompressing it in MP4 format at the highest quality settings. Even with a 2-pass encode, a 2-hour movie is finished in less than 20 minutes :) Plus I can still burn a disc, check email, and surf the web all at the same time.
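
Just to put a rough number on that (assuming the full 20 minutes of wall-clock time and both passes reading the whole movie, which is a simplification):

```python
# Back-of-the-envelope estimate of the effective encode speed described
# above: a 2-hour movie, 2-pass encode, finished in about 20 minutes.
movie_minutes = 120
passes = 2
wall_clock_minutes = 20

# Minutes of video processed across both passes vs. real time spent.
realtime_factor = (movie_minutes * passes) / wall_clock_minutes
print(f"Effective speed: roughly {realtime_factor:.0f}x realtime across both passes")
```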
 
Agreed

Powerful quad cores have been affordable for users who require lots of processing power, and for those who don't, current budget dual cores are more than enough.

For the huge majority of users out there, there isn't any application that pushes a current system to its limit.
 
What's the point, computers are already so powerful, who needs to upgrade? We've got CPUs that run at 1 GHz, 256MB of RAM, 20GB hard drives, and videocards that use DDR memory with several gigabytes/s of memory bandwidth, and GPUs that actually offload geometry processing from the CPU!
 
Yeah, I don't think the ma-and-pas out there are upgrading their computers past a dual core. I don't think AOL needs much more to run. : )
 
What's the point, computers are already so powerful, who needs to upgrade? We've got CPUs that run at 1 GHz, 256MB of RAM, 20GB hard drives, and videocards that use DDR memory with several gigabytes/s of memory bandwidth, and GPUs that actually offload geometry processing from the CPU!


No, this is not the same. I was there, I lived through that. Back then, in the '90s, the hardware was being pushed to its limits even when new.

Memory was at a premium and swapping to disk was a regular occurrence. You had to think about what went on your hard drives because they weren't excessive yet. Video cards were being benchmarked at resolutions that the average person actually had a monitor capable of playing at. Processors could do stuff faster, but you could still load them up.

The average user can't do that today with the average hardware that is available. I *only* have 6GB of memory, and there is only one thing I have ever done that came close to using all of that. Most video cards are benchmarked at 2560x1600 and still manage nearly 60FPS, yet how many people actually have a 30" monitor? Who can get a Core i7 to 100% usage? Who actually uses the 200-400MB/s of read speed an SSD offers? I'm talking regular users here... that's the majority. The market can't rely on folders and torrenters.
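
For what it's worth, about the only way a regular user gets every core to 100% is to make it happen on purpose. Here's a rough sketch (Python, purely as an illustration) of what that actually takes, one deliberately busy worker per core:

```python
# Toy illustration: pinning every core at 100% takes a deliberately parallel,
# CPU-bound workload, which ordinary desktop use almost never produces.
# Spins all cores for about 10 seconds, then exits.
import multiprocessing as mp
import time

def spin(seconds):
    """Busy-loop on one core for the given number of seconds."""
    end = time.time() + seconds
    x = 0
    while time.time() < end:
        x += 1  # pointless work, just to keep the core busy

if __name__ == "__main__":
    workers = [mp.Process(target=spin, args=(10,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(f"Kept {mp.cpu_count()} cores busy for about 10 seconds.")
```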

And all this stuff is cheaper than it was 10 years ago. I think it's all great. I love where we are hardware-wise, but we need to see personal computers start giving us user experiences like [James Cameron's] "Avatar" or something. Not more resolution... more detail. We have the space and power, use it.
 
Also, I would be happy with more advances in multi-monitor setups... like monitors with no bezels or ones that can snap together. The video cards are willing.
 
So let's say all you ever do on a computer is use Notepad, Calculator, a basic internet browser, PuTTY for SSH, and that's it. Let's go back 10 years.

Now, slowly go through time and take into account the advancements made in technology for superior hardware.

Do you really need an octo-core i7 at 3.2GHz with a 12MB cache and 8GB of DDR3 RAM with dual gigabit Ethernet ports?

At some point in "the curve" you don't need more RAM, a more powerful CPU, a better GPU, etcetera for the things you do.

For most businesses, you probably don't need anything more than one of those ASCII-based/SSH-based interfaces, and in some scenarios, your basic software. At a certain point, the hardware reaches the point where anything more would just be unnecessary, non-beneficial excess.



So.. based on that, where do you think the electronics/computer and IT market is going to go? ;)

Are they going to be making operating systems that require 16GB of RAM, a 250GB SSD, the latest and best GPU, and dual gigabit Ethernet (literally requiring TWO ports, not ONE, and GIGABIT, not 100Mbit)?
 
Sort of like this (and yes, I know it's a very bad illustration):

[Attached illustration: pc_hwperf_timeline.png]


Going back to the mid-1990s to the early 00s, remember how quickly hardware (RAM, HDDs, CPUs, motherboards) was improving? Remember the days when 128MB was considered high-end? 512MB (what was this, like the very late 90s and early 00s?)? 1GB-2GB (2002-2003?)?

The actual performance gain decreases the more you get. Especially for just basic functions, at a certain point you reach the point where you really don't need more (and if you did get more, it would be mostly unnoticeable, a very small actual performance gain, etcetera).

Most users (consumers) would need something like a 1.5GHz dual-core, 2GB of RAM maximum, and a basic GPU (i.e. your onboard Intel/ATI/NVIDIA).

Even 4GB (compared to 2GB) doesn't seem to be a big performance gain. Looking back over the decade, remember how having 512MB of RAM total (when 512MB was considered high-end) made such a huge difference over whatever the average was? And as each new thing came out, this "huge difference" became less and less and less...
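
To put a toy model behind that (and it is only a toy, with a made-up working-set size): if you assume a typical user's software only ever needs so much memory at once, each RAM upgrade helps right up until everything fits, and after that extra RAM just sits there:

```python
# Toy model of why RAM upgrades show diminishing returns. Big assumption:
# a typical user's workload has a fixed working set (guessed at ~1.5GB here).
# RAM beyond the working set mostly sits idle.

WORKING_SET_MB = 1536  # assumed size of everything a typical user runs at once

def paged_out(ram_mb):
    """How much of the working set doesn't fit in RAM and must hit the disk."""
    return max(WORKING_SET_MB - ram_mb, 0)

for old, new in [(256, 512), (512, 1024), (1024, 2048), (2048, 4096)]:
    saved = paged_out(old) - paged_out(new)
    print(f"{old}MB -> {new}MB: {saved}MB less spilling to disk")

# The early upgrades each eliminate hundreds of MB of disk thrashing; once RAM
# covers the working set (2GB here), the 2GB -> 4GB step saves nothing, which
# is the "barely noticeable" jump described above.
```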

Unless operating systems like Mac and Windows boost requirements unnecessarily (which I would say has been happening, especially with Vista/7) and stuff like that, a 1.5GHz dual-core/2GB/basic GPU system alone should cost only like $200 tops (and I mean a decent, stable, reliable system -- today's hardware is proof enough).
 
Ya, pretty much. I even do high-end shit, and I'm happy with my Core 2 Quad. I do digital music synthesis, like big virtual instruments and effects and so on. It hits the CPU much harder than normal apps... and the C2Q handles it. I suppose if I have a whole lot of effects I might be able to get CPU-bound before I get I/O-bound, though it's not likely, but then I can just bounce a couple of tracks, which takes almost no time thanks to the fast CPU, and go back at it.

I want a new CPU just because I'm a geek, and I will probably get an Arrandale, but I totally skipped the i series, not because they aren't rocking processors, but because there is just no need; the C2Q is ALSO a rocking processor.

It isn't even a matter of basic functionality; it's that old systems are good, they are high performance. You can do fairly high-end, powerful stuff and still have excellent performance and not need anything new.
 
Still running dual core here!! Guess it's time to upgrade to a 6 core!! Mobo is flashed and ready, but my wallet isn't!!
 
It's so strange sales haven't picked up since there has been nothing new in 2 years, lol.

Seriously, the i7 has been out for 2 years now with no replacement or real upgrades.
 
With Christmas around the corner, we should see chip sales pick up next. I know about a dozen people looking to upgrade some rather old machines. Sure, they won't be buying top of the line, but it should help clear out some older stock.
 
The people that only browse and watch videos can lag behind in performance by 8 years, but it's also a moving target. There will always be more interactive websites and higher-def videos (not counting video games, which are obviously cutting edge), and also cutting power consumption for mobile devices, which is advancement in another direction.
 