Has The PC Hit Rock Bottom?

HardOCP News

[H] News
OH NO! Has the PC finally hit rock bottom? :rolleyes:

The computer industry has been in a prolonged slump. This week things went from bad to worse as both Intel and AMD reported declines in business at a time when their customers should be stocking up for the holidays. It's still early, but so far faster chips inside better-designed systems running a new operating system seem to be having little impact. The question is where do we go from here?
 
Another day, another doom-and-gloom article by a "tech journalist" who does all of their computing from an iPad.
 
Doesn't surprise me. It's why the PC makers are all about portables now. Outside of a select few apps (CAD3D, games, UHD+ film editing), there's really no reason to upgrade. I'm still running an original i7-920 @ 3.8 (X58 mobo) and it does it all for me (GFX editing, video editing, 3D modelling, gaming). The only significant upgrades I've needed in the past few years were an SSD and a GFX card.

Sure, I can upgrade, but for what? To have better encoding times on videos? My workflow isn't hindered at the moment.

Now take that and compare it to what your average user would use (Browser, Office, 1080p video if that) and they haven't felt a need to update for even longer.
 
1. New versions of Windows used to spur hardware upgrades due to higher requirements, but the last several versions, from Vista all the way up to 10, have had the same requirements and mostly the same footprint. So, old systems from around the time Vista came out still run Windows 10 nearly as well as modern systems.

2. Intel has stalled processors. Their latest version, Core i7-6700k, is only a small upgrade over the old Core i7-2600k. Newer processors use a whole lot less power, but when the performance isn't changed much, people don't buy new ones.

3. AMD has not provided any competition for Intel to spur processor development. Without that, Intel stalls.

So, the fault clearly lies with Microsoft, Intel, and, to a lesser degree, AMD.
 
PCs and PC hardware are actually booming. Considering the game market is now bigger and makes more money than the film industry, I think it's safe to say that PCs, and the people who buy parts for their PCs to play these games, are going to be around for a long, long time.
 
#1 Hardware is not getting faster. You have Intel with their super non-upgrade Skylake chips, using the same number of cores they've been shipping for nearly a decade. You have AMD/Nvidia with incremental upgrades over their previous generation of graphics cards.

A 2500K with a GTX 780 will still be a badass machine. Laptops are also getting too hot to handle and break easily because of it. Also, Intel CPUs are pretty much dual core unless you buy a high-end i7, because some i7s are dual core. WTF?

#2 Windows 10 sucks. I don't give a damn how glorious you think it is. Too many questionable practices in Windows 10 make me wonder if Microsoft is aware of my porn collection. Also, the upgrade brings driver issues, which for a modern OS you'd think would be a dead issue by now. Certainly Linux doesn't have legacy driver issues, because its drivers are open source.

#3 I don't give a damn about Surface Pros. WTF is this thing? A tablet? A laptop? It's stupid as hell. How's the hardware? Sucks shit. How's the storage? Sucks shit. The IPS screen is great, but I don't care about that. The price? Sucks shit. The kickstand? Sucks shit.

Just make better, faster, and cooler laptops. Seriously, just put water cooling in or something, because modern laptops throttle a lot. Put discrete GPUs and CPUs with 4 cores in them already. Why do I need to explain this in 2015?

Also, look at this: the Surface Pro 4 still has the kickstand? Why?
[image: Surface Pro 4 product photo]
 
2. Intel has stalled processors. Their latest version, Core i7-6700k, is only a small upgrade over the old Core i7-2600k. Newer processors use a whole lot less power, but when the performance isn't changed much, people don't buy new ones.

3. AMD has not provided any competition for Intel to spur processor development. Without that, Intel stalls.

Mainly this.

I'm on a 3-year replacement cycle for laptops at the office. While the new models draw less power, they are no faster than the ones we bought 3 years ago, and those were only about 15% faster than the ones 3 years before that. A 15% improvement over 6 years is not the way to sell more hardware.

We use high-end laptops (fast dual core, 16GB RAM, 15" 1080p screens), but most newer laptops seem to be going the other way: smaller screens, lower resolution, slower CPUs, a limit of 8GB RAM (single stick), and smaller batteries, which offsets most of the energy savings.
 
I have to agree about the Surface Pro. I've never understood its appeal. It's too heavy to be a good tablet, and it can't be used easily on a person's lap like a laptop. The only easy way to use it is on a table with the kickstand out, in which case it's competing with desktops, and everything about a good desktop is going to be way, way better than a Surface.
 
Agreed on the CPU performance stall. I priced out a new build, seeing as my current desktop is 6 years old. I have an ancient Athlon X3 435 at 2.9GHz. I got it with a motherboard for under $100 6 years ago at Microcenter. If I look at similar prices nowadays, performance has only increased by maybe 50%. There just isn't any strong reason to upgrade, so people are hanging onto their computers longer.

There was a time of huge GPU advancement, but even that has stalled. A few months back I ditched my HD 3850 and got a used GeForce 660 Ti, an 8x improvement for that 5-year gap. Going from a 660 Ti to a 960 (a 2.5-year gap) only gets you about, what, 25% more?
 
Funny thing is, if Intel would sell us a subscription to a self-destructing Extreme Edition CPU for 1/5 their normal price (disables itself after 18 months and you can send it back to recycle for a credit towards your next subscription, or they convert it into a special sku for RMA exchanges using special reactivation instructions), some of us would be happy to just keep buying them and staying on the bleeding edge. Kind of like disposable contacts cost vs the old kind. They could keep their production lines running all the time, and we just get speed bumps every cycle. They'd have to stick to the same socket a bit longer though unless the whole industry adopts this approach. Actually, I wish Nvidia would do that, but they don't have excess capacity problems.
 
I just picked up a used laptop with a 2nd-Gen Core i7 and a FirePro 8900m, upgraded to an SSD and maxed out RAM at 16GB, all for less than $600. The machine is more than fast enough for probably 98% of computer users, and likely will remain so for at least a couple years. It'll easily hold its own against just about any new laptop / desktop within its price range. And with the FirePro m6100 I'll be purchasing for it in the next year, it will stay relevant even a bit longer.

Except for the select few who actually push their machines to 100% capacity on a regular basis, there is no reason for most people to buy a new machine. The tech industry, though, got spoiled on the need to upgrade regularly over the previous 3+ decades, as hardware and software rapidly evolved to meet the demands of users, and it is too stubborn to change business models that rely on a short upgrade cycle, a change that will be necessary going forward. Instead of providing truly useful new features to make the world easier and better, they resort to artificial means of creating situations where upgrades are necessary. For instance, if you like Outlook / Exchange and you have a Windows 8 or newer machine, you NEED Office 2013, as 2010 and older strangely have a compatibility issue that Microsoft refuses to fix, and I have a hunch that it isn't coincidental. Cell phone manufacturers have been doing the same thing since the early 2000s. There's no reason why my smartphones should perform so poorly after less than 2 years when my nearly 20-year-old StarTAC would still be working just fine if it were compatible with the new radio standards.

I'd say the market demand is tapering off, not slumping. "Slumping," to me, implies that there should always be a need to upgrade on a short cycle, and it feeds companies' arrogance that they are entitled to a constant influx of consumer $$$. Give us reasons to WANT to upgrade, instead of making profit margins the highest priority. People like toys and have no problem replacing them with something newer when there is a tangible benefit.
 
Doesn't surprise me. It's why the PC makers are all about portables now. Outside of a select few apps (CAD3D, games, UHD+ film editing), there's really no reason to upgrade. I'm still running an original i7-920 @ 3.8 (X58 mobo) and it does it all for me (GFX editing, video editing, 3D modelling, gaming). The only significant upgrades I've needed in the past few years were an SSD and a GFX card.

Sure, I can upgrade, but for what? To have better encoding times on videos? My workflow isn't hindered at the moment.

Now take that and compare it to what your average user would use (Browser, Office, 1080p video if that) and they haven't felt a need to update for even longer.

This. I was looking at updating my office PC... to get USB 3 headers. Performance-wise, an i3-2100 for web, email, and general tinkering is just fine.
 
I just picked up a used laptop with a 2nd-Gen Core i7 and a FirePro 8900m, upgraded to an SSD and maxed out RAM at 16GB, all for less than $600. The machine is more than fast enough for probably 98% of computer users, and likely will remain so for at least a couple years. It'll easily hold its own against just about any new laptop / desktop within its price range. And with the FirePro m6100 I'll be purchasing for it in the next year, it will stay relevant even a bit longer.

Except for the select few who actually push their machines to 100% capacity on a regular basis, there is no reason for most people to buy a new machine. The tech industry, though, got spoiled on the need to upgrade regularly over the previous 3+ decades, as hardware and software rapidly evolved to meet the demands of users, and it is too stubborn to change business models that rely on a short upgrade cycle, a change that will be necessary going forward. Instead of providing truly useful new features to make the world easier and better, they resort to artificial means of creating situations where upgrades are necessary. For instance, if you like Outlook / Exchange and you have a Windows 8 or newer machine, you NEED Office 2013, as 2010 and older strangely have a compatibility issue that Microsoft refuses to fix, and I have a hunch that it isn't coincidental. Cell phone manufacturers have been doing the same thing since the early 2000s. There's no reason why my smartphones should perform so poorly after less than 2 years when my nearly 20-year-old StarTAC would still be working just fine if it were compatible with the new radio standards.

I'd say the market demand is tapering off, not slumping. "Slumping," to me, implies that there should always be a need to upgrade on a short cycle, and it feeds companies' arrogance that they are entitled to a constant influx of consumer $$$. Give us reasons to WANT to upgrade, instead of making profit margins the highest priority. People like toys and have no problem replacing them with something newer when there is a tangible benefit.
I think that makes complete sense from a consumer perspective. However, Intel is currently (successfully) satisfying the needs of the Amazons/Googles/Microsofts of the world, who replace everything every 2-3 years because the TCO from lower energy consumption in the cloud justifies the upgrade and they can squeeze more servers into the same footprint.

Most consumers don't care about energy/HVAC and carbon emissions enough to structure their purchases around them (unless it's price-at-the-pump MPG, and even then they seem tolerant; I'm guilty as charged too), but in a sense, Intel focusing on power reduction is good for the world. As consumers, though, we simply don't care. I'm happy buying dirt-cheap Sandy and Ivy Bridge Xeon parts abandoned by the industry for less than a new Skylake board, even though I'm not helping with the global warming problem. Until they somehow penalize the operation of inefficient hardware (the way smog tests and European annual auto checkups essentially do), or provide us a carrot (subscription upgrades), I don't think we consumers will be changing our ways.
 
I think the issue is still a lack of compelling software. In the early days of the PC ramp, Andy Grove used to talk about the technology spiral: new software required new hardware to meet its requirements, which drove new hardware releases, which drove new software advances, and on and on. Except for a few high-end games, there are few compelling reasons to upgrade to new hardware every year (or even every few years). You use the hardware until it breaks and THEN you buy new hardware.

If there was a compelling new application that could only run efficiently at 5 GHz with dual GPUs then we would have systems that offered that and at a price point that made it palatable for mass consumption. However, Netflix runs pretty much the same on a 5 year old system as a new one, email runs on systems decades old, and internet surfing can be done with a minimal system configuration.

Will Oculus, or Home Automation, or some other capability drive new hardware purchases ... possibly ... but we definitely don't have any software or supporting hardware that needs substantial power (except for very narrow niche needs) ... I think that is what needs to change before we see a big boom in purchases
 
That there are now Microsoft branded tablets, phones, and a laptop is proof that the PC hit rock bottom.

Is this bottom enough for the PC to admit that it has a problem though or will it have to find new bottoms?
 
Not the end, but definitely nearing a big change in CPU tech and programming techniques.
 
Most consumers don't care about energy/HVAC and carbon emissions enough to structure their purchases around them (unless it's price-at-the-pump MPG, and even then they seem tolerant; I'm guilty as charged too), but in a sense, Intel focusing on power reduction is good for the world. As consumers, though, we simply don't care. I'm happy buying dirt-cheap Sandy and Ivy Bridge Xeon parts abandoned by the industry for less than a new Skylake board, even though I'm not helping with the global warming problem. Until they somehow penalize the operation of inefficient hardware (the way smog tests and European annual auto checkups essentially do), or provide us a carrot (subscription upgrades), I don't think we consumers will be changing our ways.
Mobile computing consumers do care about heat and energy required (no appeal needed to global warming or pollution). Intel is still making architectural improvements and adding instructions.

Your general point still stands though. To begin with, Moore's law was never that processors would be twice as fast every 2 years, just double the amount of transistors. Intel has revised this to more like every two and a half years this year.

Intel still produces higher wattage parts, it's just not the core of their business.
 
I used to build a new machine every 6-12 months to keep up with performance. I haven't built one in 3 years now. Why bother? The advances in processor speed/performance have been totally meh over the last 3 years. Hell, the original i7-920 is still more than fast enough for a typical desktop!

My main desktop is a 2600K; the game machine is a 3770K. The desktop has a 7970 and the game machine has a 290X.

Don't need new machines. The processor is far from being the bottleneck any more. A new SSD here, a new Graphics Card there, and a machine can go a good 5 years plus. Even for us Geeks.

And I sure as hell don't see the need for DDR4, or the justification for its cost.
 
Oh, and btw... they are pissing me the hell off with all the totally unnecessary socket changes. Leave the fucking socket alone! We don't need new motherboards every damn year, and good ones are getting expensive!

I'd just like to upgrade the processor every now and then. Maybe.

They created this mess. Don't look at us.
 
That there are now Microsoft branded tablets, phones, and a laptop is proof that the PC hit rock bottom.

Is this bottom enough for the PC to admit that it has a problem though or will it have to find new bottoms?

Microsoft's entrance into the PC market was certainly driven in part by the slump in PC sales, but there's more to it than that. Apple has iconic laptops and desktops, renowned by many for their engineering quality, hence the price levels Apple can charge. There was nothing like it in the PC world. Where was that iconic PC, that household name of a PC that was cool and lust-worthy?

This has simply never existed in the PC world, which has for so long been too full of cheap plastic junk. The number of new PCs sold will probably settle, long term, at a much lower level than the historic highs, but I think PC makers can make more profit on the machines they do sell if they push the envelope with good designs and focus a little less on pushing large volumes of cheap plastic junk, which they aren't going to sell as much of anyway in this day and age.
 
Mobile computing consumers do care about heat and energy required (no appeal needed to global warming or pollution). Intel is still making architectural improvements and adding instructions.

Your general point still stands though. To begin with, Moore's law was never that processors would be twice as fast every 2 years, just double the amount of transistors. Intel has revised this to more like every two and a half years this year.

Intel still produces higher wattage parts, it's just not the core of their business.
Mobile computing needed efficiency to get past the first hurdle: about 8-10 hours of battery life so you can go wireless for a single work day, and for business-class laptops and MacBooks it has pretty much done that. After that it's just weight reduction, and we're already so thin-and-light that I don't see people replacing their high-end MacBooks for CPU or form-factor improvements, but rather for screens, better SSD tech, and sometimes GPU performance. I'd love to see power efficiency gains eventually give me week-long battery life, but it's more likely they'll stick with 10 hours and keep shrinking the battery (just like all cars have roughly a 300-mile range despite differences in MPG). In 4 years, if we can already get down to a 30Wh battery for a 10-hour business laptop and they announce a massive 50% power reduction in the following generation, I don't think anyone would upgrade just to shave another 1.5 ounces off their laptop; every magazine will announce their ho-hums, and nobody will buy new laptops until their non-replaceable batteries wear down. Cloud customers, however, will drool at the prospect of doubling instance capacity in the same footprint and will double up on orders (further pushing more of Intel's profits into the cloud segment), but every magazine will complain Intel isn't bringing anything new to the table for the consumer.
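Quick back-of-envelope on that battery-shaving point (every number here is an assumption, just to sanity-check the weight claim; it lands in the same couple-of-ounces ballpark as the guess above):

```python
# Assumed: 30 Wh pack, 10 h runtime, ~200 Wh/kg for a finished Li-ion pack.
pack_wh, runtime_h = 30.0, 10.0
avg_draw_w = pack_wh / runtime_h                   # ~3 W average platform draw
new_pack_wh = (avg_draw_w * 0.5) * runtime_h       # 50% power cut, same runtime -> ~15 Wh
saved_g = (pack_wh - new_pack_wh) / 200.0 * 1000   # ~75 g of cells you could drop
print(f"{avg_draw_w:.1f} W avg; battery shrinks by ~{saved_g:.0f} g (~{saved_g / 28.35:.1f} oz)")
```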

If consoles die with the coming of VR gaming, perhaps things will change, but even today I can run 2 AAA titles simultaneously on a Nehalem i7 (running a bare-metal hypervisor and two high-end video cards in passthrough at 1080p). I'm not quite convinced that VR requirements can't be met within the current performance envelopes on the CPU side, and GPUs aren't tied too closely to the CPU generation (even current GPUs run close to full speed on PCIe 2.0 x8). There's nothing really forcing consumers to upgrade over the next 2 cycles unless it's one of those revolutionary changes nobody can predict, or some kind of financial incentive.
 
I'm still using the 2500K I bought as a temporary stop-gap four years ago (didn't even build a full system, just CPU + mobo + RAM). If you compare overclocked speeds, it's not much slower than a 6600K, and it's not like I'm running into software it can't handle. I actually looked into an upgrade for Skylake, but there doesn't seem to be much point. I guess I have to wait for that PCIe SSD I wanted... It's not like the 840 PRO I already have isn't doing just fine anyway. Intel should have just added more cores; that would have given us a reason to upgrade. $300 octo-core i7? Yes please.

I expect the GPU issue will be resolved with the next refresh; they can just add more execution units and be just fine, they don't need to keep scaling frequency. The only reason for that stall is the process problems at TSMC. As for me, I'm actually using two 280Xs I picked up for cryptocoin mining. They're no good for that anymore (I'm not complaining, they paid for themselves and more), but they're more than I need for 1080p, so I don't need to buy a new GPU either. Maybe they'll come up with something good when DX12 is common.
 
Oh, and btw... they are pissing me the hell off with all the totally unnecessary socket changes. Leave the fucking socket alone! We don't need new motherboards every damn year, and good ones are getting expensive!

I'd just like to upgrade the processor every now and then. Maybe.

They created this mess. Don't look at us.
I hate the socket wars too, but it's interesting that they only happen in the consumer space. Intel keeps releasing new sockets for new chipsets with consumer-facing features (faster USB, SSD caching, fast boot / resume, Thunderbolt, etc.), and none of these features are for the enterprise segment. In the enterprise, by contrast, 1366 and 2011 have pretty long lifespans, because Intel isn't adding consumer features to those platforms. Consumer stuff is bleeding edge, and they need something other than CPU functionality to drive sales, in a socket that is as physically small as possible to make devices as small as possible, since cyclical footprint reduction is a requirement from the OEMs making consumer products.

Then we have the whole RAM change: the RAM manufacturers want to move to DDR4 and stop manufacturing DDR3, so Intel is just meeting the roadmaps of its manufacturing partners. And DDR4's benefits aren't in the first generation but in the later ones, with improved scalability. Nothing consumers care about if they're happy with 8GB per machine, but memory vendors want to sell 32, 64, and 128GB DIMMs, just as content providers, advertisers, and the NSA want terabytes of RAM per machine to crunch our online metadata for correlation without crossing buses and networks as frequently.

Personally, I invested in both LGA1366 and LGA2011 and I'm getting great cheap upgrades every year. Recently 8GB DDR3 RDIMMs have gotten dirt cheap, and I regularly look on eBay for cheap Xeons with more cores / lower heat and cooling noise, without having to change motherboards too frequently.
 
Microsoft's entrance into the PC market was certainly driven in part by the slump in PC sales, but there's more to it than that. Apple has iconic laptops and desktops, renowned by many for their engineering quality, hence the price levels Apple can charge. There was nothing like it in the PC world. Where was that iconic PC, that household name of a PC that was cool and lust-worthy?

This has simply never existed in the PC world, which has for so long been too full of cheap plastic junk. The number of new PCs sold will probably settle, long term, at a much lower level than the historic highs, but I think PC makers can make more profit on the machines they do sell if they push the envelope with good designs and focus a little less on pushing large volumes of cheap plastic junk, which they aren't going to sell as much of anyway in this day and age.
Historically, in the PC market, you're wrong.

It has been said before on this forum that the quality laptops Sony was building in the late '90s and early 2000s stood apart from their competitors as "iconic laptops and desktops, renowned for their engineering quality," and people weren't buying them at "the price levels that Apple can charge." I saw this firsthand. Maybe Microsoft can manage now what Sony couldn't do back then.

Either way, when the famous software maker Microsoft steps in to make laptops and tablets, contrary to their entire history, it says something about the state of the PC.
 
but even today I can run 2 AAA titles simultaneously on a Nehalem i7 (running a bare-metal hypervisor and two high-end video cards in passthrough at 1080p). I'm not quite convinced that VR requirements can't be met within the current performance envelopes on the CPU side, and GPUs aren't tied too closely to the CPU generation (even current GPUs run close to full speed on PCIe 2.0 x8). There's nothing really forcing consumers to upgrade over the next 2 cycles unless it's one of those revolutionary changes nobody can predict, or some kind of financial incentive.
But will it play Fallout 4?
 
But will it play Fallout 4?
Haha, I'm looking forward to that, but my main gaming rig these days is a higher-efficiency i5-3570 (non-K for VT-d support, with MCE to hit 4GHz); however, I've switched to dual 970s, and I haven't quite gotten NVIDIA passthrough working on those yet. It was a bit easier with the AMD 7950s. But it's easy enough to find out: just disable half the cores and see if it does a reasonable job. I honestly stopped using virtualized Windows for gaming once certain MMOs (like Tera) started detecting virtualized environments, presumably because people were multi-boxing. I think it's stupid to dictate what players are and aren't allowed to use as hardware, but I suppose someone may have found a way to build massive bot farms in VMs, since you can modify the virtual peripherals and send inputs without the game being able to detect it. Now I need a separate rig rather than running my games off my server, and my power bill suffers.
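For anyone curious, the launch looks roughly like this with QEMU/VFIO. This is a sketch, not my exact setup: the PCI addresses, core count, memory, and disk image are placeholders, and the card has to be bound to vfio-pci beforehand. The `kvm=off` bit hides the hypervisor signature from the guest, which is the usual workaround for the NVIDIA driver refusing to start (Code 43) inside a VM.

```python
# Rough sketch of a VFIO GPU passthrough launch. Assumes the GPU (and its HDMI
# audio function) live at 01:00.0/01:00.1 and are already bound to vfio-pci
# via intel_iommu=on plus vfio-pci.ids=... on the kernel command line.
import subprocess

qemu_cmd = [
    "qemu-system-x86_64",
    "-enable-kvm",
    "-machine", "q35,accel=kvm",
    "-cpu", "host,kvm=off",            # hide the KVM signature from the NVIDIA driver
    "-smp", "4",                       # half the host cores, as in the test above
    "-m", "8G",
    "-device", "vfio-pci,host=01:00.0,multifunction=on",  # the GPU itself
    "-device", "vfio-pci,host=01:00.1",                   # its HDMI audio function
    "-drive", "file=win10.img,format=qcow2,if=virtio",    # placeholder guest disk
]
subprocess.run(qemu_cmd, check=True)
```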
 
It's the trends and a new generation. These guys grew up with smartphones, and Android this and that. Tablets. They think we're "weird" because we care so much about clocking our CPU, memory, and GPU up. It was hardcore to get even 5-10 extra fps years back; we spent hundreds doing so. Running that Quake timedemo, that 3DMark. Nobody does that now. It's like the computer has become a gaming box: you just turn it on and play. Sadly, Xbox had a hand in this. There's really nothing on the computer side to impress today's generation. Even though WE know the huge differences between an Xbox One or PS4 and a high-end comp, they don't, nor do they probably care. Their Xbox One is hooked to a high-definition screen and runs at 720p or, if they're lucky, maybe reduced 1080p. I personally care about every little detail still to this day. I will forever tweak until I'm not able to anymore.
 
Haha, I'm looking forward to that, but my main gaming rig these days is a higher-efficiency i5-3570 (non-K for VT-d support, with MCE to hit 4GHz); however, I've switched to dual 970s, and I haven't quite gotten NVIDIA passthrough working on those yet. It was a bit easier with the AMD 7950s. But it's easy enough to find out: just disable half the cores and see if it does a reasonable job. I honestly stopped using virtualized Windows for gaming once certain MMOs (like Tera) started detecting virtualized environments, presumably because people were multi-boxing. I think it's stupid to dictate what players are and aren't allowed to use as hardware, but I suppose someone may have found a way to build massive bot farms in VMs, since you can modify the virtual peripherals and send inputs without the game being able to detect it. Now I need a separate rig rather than running my games off my server, and my power bill suffers.
That's awesome! I loved seeing that guy running GTA V virtualized on ArchLinux using QEMU, KVM, and PCIe passthrough.
 
My dual-core Celeron laptop from 2009 can barely handle the internet now.

Other than that, my AMD Llano laptop from 2011 is still going strong, and my Athlon X4 from 2009 also works fine.

I am contributing to this slump.
 
Historically, in the PC market, you're wrong.

It has been said before on this forum that the quality laptops Sony was building in the late '90s and early 2000s stood apart from their competitors as "iconic laptops and desktops, renowned for their engineering quality," and people weren't buying them at "the price levels that Apple can charge." I saw this firsthand. Maybe Microsoft can manage now what Sony couldn't do back then.

I'm 47, and I've been around Windows PCs since almost Day One. I'm pretty familiar with Sony PCs, and they made some great ones. And they made some plastic crap as well. In all my years with Windows PCs, I've never seen the level of interest in a single Windows PC device that there is in the Surface Book. Not even close.

Either way, when the famous software maker Microsoft steps in to make laptops and tablets, contrary to their entire history, it says something about the state of the PC.

Apple has long prided itself on delivering both the hardware and the software, and has many times stated that by doing so it can do a better job of delivering a solid computing experience. Microsoft is no stranger to hardware, but as its OEMs have long been its best customers, it didn't want to compete with them. Even at this point I don't think Microsoft really wants to. But in this day and age people buy devices; at least that's where they put their money as consumers. They don't buy software, and certainly not software upgrades.

The Microsoft model of selling Windows licenses directly to consumers is all but done. And that will probably be its fate with OEMs as well. You cannot sell an OS to consumers anymore. For all of the hate about Windows 8.x/10, about privacy issues and tablet apps and such, that's the reality of today. Virtually no consumer pays $200 for Windows licenses; people coming up with these numbers are making them up.

So Microsoft is trying to make some money with hardware to make up for revenue it's never going to see with Windows licenses and at the same time trying to create an iconic PC. The idea started out really rough, especially with the disaster of Windows RT and Surface RT. Now that the idea has been focused better towards a loyal and spendy PC niche, it's working much better.
 
That's awesome! I loved seeing that guy running GTA V virtualized on ArchLinux using QEMU, KVM, and PCIe passthrough.
The first time I had it running, I thought Microsoft should build it into Windows; it would be such a killer app in a future version of Windows. They've had Hyper-V working since Windows 8, passing the GPU through to the primary VM; they just need passthrough of a second card and a USB controller for the second screen. It would be great for the kids, and it would give Windows some gaming cred that no console could match.

I read an article about someone running Steam on an Amazon GPU-enabled instance, then using Steam streaming to connect from home over OpenVPN. He was estimating $0.57 an hour using spot pricing, which is quite reasonable. The process was pretty involved, but if it could be set up as a CloudFormation template to automate the whole thing, it would be a pretty cool way to give your non-gamer friends a taste of multiplayer goodness even if they only have an iGPU machine.
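I haven't tried it myself, but the spot-request half would look something like this with boto3. Everything specific is a placeholder (you'd want a Windows AMI with Steam installed, plus OpenVPN set up for the streaming link); the instance type is just the GPU instance those writeups usually used.

```python
# Sketch only: placeholder AMI / key pair / security group, with the bid set a
# bit above the ~$0.57/hr spot figure quoted above.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.request_spot_instances(
    SpotPrice="0.60",
    InstanceCount=1,
    LaunchSpecification={
        "ImageId": "ami-12345678",            # placeholder Steam-ready Windows AMI
        "InstanceType": "g2.2xlarge",         # assumed GPU instance type of that era
        "KeyName": "my-keypair",              # placeholder key pair
        "SecurityGroupIds": ["sg-12345678"],  # placeholder; open the VPN port here
    },
)
print(resp["SpotInstanceRequests"][0]["SpotInstanceRequestId"])
```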
 
It's the trends and a new generation. These guys grew up with smartphones, and Android this and that. Tablets. They think we're "weird" because we care so much about clocking our CPU, memory, and GPU up. It was hardcore to get even 5-10 extra fps years back; we spent hundreds doing so. Running that Quake timedemo, that 3DMark. Nobody does that now. It's like the computer has become a gaming box: you just turn it on and play. Sadly, Xbox had a hand in this. There's really nothing on the computer side to impress today's generation. Even though WE know the huge differences between an Xbox One or PS4 and a high-end comp, they don't, nor do they probably care. Their Xbox One is hooked to a high-definition screen and runs at 720p or, if they're lucky, maybe reduced 1080p. I personally care about every little detail still to this day. I will forever tweak until I'm not able to anymore.

This is certainly a generational issue. Most of the issues with Windows 8.x and the complaints about 10 are about different expectations of what computing is about. At my age, I grew up tweaking this and that and whatever, and that's just not what computing is about today. Computing devices are go-to tools that need to work and be simple to use. Local files, installing .exes, scripts to optimize this or that: that's not going to appeal to many folks.

And that's much of the problem with Windows. People love to proclaim the greatness of Windows 7. Not nearly as many care about it in 2015 the way they did in 2009. Computing years are like dog years, and Windows 7 is now 42 years old. Of course there are other factors: businesses, folks that don't care or don't want or need anything new. But the "what I have is fine and I don't need or want anything new" crowd isn't exactly a market for much.
 
That there are now Microsoft branded tablets, phones, and a laptop is proof that the PC hit rock bottom.

Is this bottom enough for the PC to admit that it has a problem though or will it have to find new bottoms?

Meaning, they finally produced something that was worth spending my money on (Surface Pro, HTC One M8 for Windows Phone). Those who do not think the Surface Pro is a good device have clearly not used one or taken the time to look outside their little box.
 
This is certainly a generational issue. Most of the issues with Windows 8.x and the complaints about 10 are about different expectations of what computing is about. At my age, I grew up tweaking this and that and whatever, and that's just not what computing is about today. Computing devices are go-to tools that need to work and be simple to use. Local files, installing .exes, scripts to optimize this or that: that's not going to appeal to many folks.

And that's much of the problem with Windows. People love to proclaim the greatness of Windows 7. Not nearly as many care about it in 2015 the way they did in 2009. Computing years are like dog years, and Windows 7 is now 42 years old. Of course there are other factors: businesses, folks that don't care or don't want or need anything new. But the "what I have is fine and I don't need or want anything new" crowd isn't exactly a market for much.

Agreed, and 7 looks so old now it's crazy, yet I still find it compelling on so many fronts. Windows 10 is like a shinier copy no doubt, but the whole windows look is definitely behind the times.
 
I'm 47, and I've been around Windows PCs since almost Day One. I'm pretty familiar with Sony PCs, and they made some great ones. And they made some plastic crap as well. In all my years with Windows PCs, I've never seen the level of interest in a single Windows PC device that there is in the Surface Book. Not even close.

Apple has long prided itself on delivering both the hardware and the software, and has many times stated that by doing so it can do a better job of delivering a solid computing experience. Microsoft is no stranger to hardware, but as its OEMs have long been its best customers, it didn't want to compete with them. Even at this point I don't think Microsoft really wants to. But in this day and age people buy devices; at least that's where they put their money as consumers. They don't buy software, and certainly not software upgrades.

The Microsoft model of selling Windows licenses directly to consumers is all but done. And that will probably be its fate with OEMs as well. You cannot sell an OS to consumers anymore. For all of the hate about Windows 8.x/10, about privacy issues and tablet apps and such, that's the reality of today. Virtually no consumer pays $200 for Windows licenses; people coming up with these numbers are making them up.

So Microsoft is trying to make some money with hardware to make up for revenue it's never going to see with Windows licenses and at the same time trying to create an iconic PC. The idea started out really rough, especially with the disaster of Windows RT and Surface RT. Now that the idea has been focused better towards a loyal and spendy PC niche, it's working much better.
Sony's domestic (Japanese) laptops have always been awesome; I think most of the thick plastic ones we associate with "crap" Sony were US-market models. I used to visit Akihabara every year from the late '90s to the early 2000s, and stores like Bic Camera, T-Zone, and Laox carried only the most stylish and innovative models (the same could be said for Fujitsu and Toshiba; I actually owned about 3 generations of Librettos, which were really unique at the time). Japanese computing was always expensive; in fact, prior to the whole DOS/V revolution, NEC had a stranglehold on the market and you had to pay to play. Until the mid-2000s, computers there were extremely expensive, premium, differentiated products with no budget / entry-level options, and Sony had a positioning in Japan similar to Apple's. Steve Jobs was a huge Sony fan, and even offered Sony a chance to sell VAIOs with Mac OS. Instead, after Sony passed on it, he hired away some of their best designers, which was kind of the beginning of the end. Sony US has always been a different beast.
 
I used to build a new machine every 6-12 months to keep up with performance. I haven't built one in 3 years now. Why bother? The advances in processor speed/performance have been totally meh over the last 3 years. Hell, the original i7-920 is still more than fast enough for a typical desktop!

My main desktop is a 2600K; the game machine is a 3770K. The desktop has a 7970 and the game machine has a 290X.

Don't need new machines. The processor is far from being the bottleneck any more. A new SSD here, a new Graphics Card there, and a machine can go a good 5 years plus. Even for us Geeks.

And I sure as hell don't see the need for DDR4, or the justification for its cost.

This is really the telling point. A LOT of us are like this now. We used to build new PCs like it was nothing. Now it's easy to go even 3-5 years, upgrading only the video card at most, and still be able to play everything that's out there at high or close to the highest settings. Nothing is really pushing the industry, at least from a gamer's point of view, like it used to. Instead of upgrading or building new PCs just to be able to play new games at all, it's now building/upgrading PCs to win the e-peen contest for the most frames per second in new games, when acceptable performance is far below what a high-end PC offers. It's just not worth it to most of us these days. I'll upgrade when I need to.
 
Windows 10 is like a shinier copy no doubt, but the whole windows look is definitely behind the times.

The problem with the whole Windows look, on the x86 side at least, stems from this generational issue. I've never seen a kid have a problem with a Windows 8.x/10 tablet or complain about the UI. For them it's about: run apps, run apps, and then run some more apps. That's where Windows is really behind the times.

The future of Windows on the consumer side is not really about Win32 desktop apps and heavy desktops and keyboards and mice. Some of it is, but not on the consumer side. Light, portable, touch, battery-efficient, and apps, apps, apps (did I say apps?), or the lack thereof, will be its destiny with consumers.
 