Why PCs Are "Failing"

I agree

1080p is so 2007. People should be running 2560x1440 or higher on a PC, with at least a 120-144 Hz refresh rate.

Yeah, but how many can actually play today's games at 1080p, full quality, at a steady 120 fps? Everybody is reaching for higher resolutions, but I want my games to be smooth as butter. Obviously I'm in the minority in that regard.
 
Desktop PCs are becoming more of a professional tool, and that's not a bad thing at all. Most people just need a smartphone, tablet, or laptop, and that's perfect for them.
There will always be a need for the latest tech among professionals.
 
Firstly, the PC never needed to be fast for the majority of people out there. To check Facebook and email and to type documents, you could still be using a Pentium 4. Games have always been the driving force of the PC, and the same goes for the tablet. That all came to an end when the Xbox 360 and PS3 were released, and every game made since has been built around console hardware.

Other than that, there's:

#1 Windows 10 sucks. It updates and installs whatever pleases Microsoft, and removes applications because they angered the Microsoft gods.

#2 Internet bandwidth caps and limited speeds. Even my cell phone is affected by these caps. Ask me if I care whether my phone has 3.5G, 4G, or LTE. They're all fast enough, because even at 3.5G speeds I can burn through my data cap in a week.

#3 Lack of faster hardware. A 2500K is about as fast as a 6600K when properly overclocked. It isn't worth spending money to upgrade your hardware.

I think the AM1 CPUs (25 W) and the newer Celerons (10 W) are perfect for email machines. I recently built an "email/web browsing" machine with an AM1 CPU + motherboard for $30 + tax, added an 8 GB stick of RAM, and threw Ubuntu on it; it's perfectly functional for that. I recycled a 64 GB SSD, but 120 GB ones go on sale for $32 (I've seen one at Amazon).

#1 is subjective. I am kind of meh on Win10, since it plus Crimson brought me video bugs with my R9 290.
#2 Very true. I bought a commercial connection to avoid this. IMHO that's more important than a higher-res monitor.

#3 Preaching to the choir. Most people can't tell the difference between a Core 2 and an i3/i5/i7 6000-series for day-to-day tasks, hence...
 
With this economy it's hard to justify spending $500 every time a new GPU generation pops up.
It makes better sense to buy a console for that price and play games that are optimized for it instead.
I blame the AMD/Nvidia duopoly for this. Without more players on the field, they can keep GPU prices artificially high.

Hell, for the price of a pair of 980 Tis you can get all three consoles presently available, plus a couple of games and accessories for each. That's what I did, and I still haven't regretted it.
 
Yup. Since most of the big games are console ports and I'm mostly a Linux user on the PC side of things, buying a PS4 or XBO is something I've considered too, though I really don't play enough games to justify buying one. More importantly, I despise having to add some dedicated card into an expansion slot just to play the occasional game, because it's so inelegant. Even making room for a PCI-E expansion slot requires a larger case and a higher-wattage power supply than necessary. Sure, adding RAM or having a removable hard drive imposes some space considerations too, but the huge, hot, noisy GPU thing goes beyond reason.

A computer should be something that uses about 10 watts of power when it's working really hard, has no moving internal parts or cooling fans, can run for about 8 hours without being plugged into a power outlet, and can be easily folded shut and carried around. PC manufacturers, in a lot of ways, haven't figured that out yet; they're still trying to sell business-grade hardware like desktops and workstations to home users when there's very little relevant crossover.
 
With this economy it's hard to justify spending $500 every time a new GPU generation pops up.
It makes better sense to buy a console for that price and play games that are optimized for it instead.
I blame the AMD/Nvidia duopoly for this. Without more players on the field, they can keep GPU prices artificially high.

Hell, for the price of a pair of 980 Tis you can get all three consoles presently available, plus a couple of games and accessories for each. That's what I did, and I still haven't regretted it.

Yup, it pains me to say it, but I have to agree. The prices are just disproportionate to what you get.
 
Unless we move into some form of intensive home automation, PCs are faster and last longer than ever before ... I am getting ready to buy a custom system for several thousand dollars, but I expect it to last me 6-8 years ... replacing PCs every 2-3 years is long gone ... most of my employers have had laptops on 5-6 year refresh cycles, so PCs' power and durability are their own biggest enemies right now.

I disagree.

Games may not push hardware, but that doesn't mean that software, particularly OS software, won't require fresh new platforms. It's never been the app that drives the changes; it's the enabling operating systems that have always been the real drivers, that and the adoption of desirable innovations like the AGP bus, USB, PCI Express, SATA II, and so on.

A processor alone does not a computer make.

Ten years ago I told people that we could already make computers pretty damn tiny, that the major push toward miniaturization was mostly over, and that the real innovation would center around communications and how we interact with computers. The biggest sign that I was correct was the birth of the smartphone and the way business is not only embracing its use but incorporating the smartphone into our daily lives.

We have not gone as far as we can when it comes to integrating communications and computing into our lives. We are only at the early stages, a clunky, half-assed rush, and we are seeing that some ideas are terrific but still need work, while others are just terrible and don't warrant further development.

When access to information and communications becomes so smoothly integrated that it is perceived as naturally as our own senses, then we will have reached a real pinnacle; until then, the race will continue.
 
With this economy it's hard to justify spending $500 every time a new GPU generation pops up.
It makes better sense to buy a console for that price and play games that are optimized for it instead.
I blame the AMD/Nvidia duopoly for this. Without more players on the field, they can keep GPU prices artificially high.

Hell, for the price of a pair of 980 Tis you can get all three consoles presently available, plus a couple of games and accessories for each. That's what I did, and I still haven't regretted it.

I disagree; to me, consoles are pale imitations of the best gaming platform, which remains the PC. Just because many titles are ports, it's incorrect to say that most are. Yes, many popular titles are ports, or more accurately parallel developments, which is different than a port. Fallout 4 was not a port; it was a product developed for multiple platforms in parallel, allowing for as much common code as possible.

At the same time, there are still plenty of great PC titles that will never see play on a console. These are not the most popular titles, because not everyone goes for this level of gaming, particularly with simulations, but they represent a solid enough following to have an effect on the future of the PC.

Take the Arma series, for example: the game has never been released for consoles and it won't be, because consoles don't have the power to run it. But the real discriminator between consoles and PCs has never been about anything other than the controller versus the keyboard and mouse. Although you can get a keyboard and mouse for a console, console developers have refused to develop the user interface for them. Because of this, the way the user interfaces with the game limits the title and creates an artificial barrier. If consoles had adopted the keyboard and mouse and leveraged their use in titles that called for it, the debate over PCs and consoles would have died years ago, because at its core, a console really is just a PC.
 
I disagree; to me, consoles are pale imitations of the best gaming platform, which remains the PC.

Nooooope, disagree with your disagree, mostly because "best" is totally an opinion thing-y. PCs have more compute power and storage than a console, and using one to play games might be best for you (and me right now, since any games I do play are on a PC), but that's not true for everyone. Lots and lots of people would say that consoles are best for their fun times, or that a phone is the best thing to use to play games.

Either way, it doesn't matter that much which does the game stuff better, since running a game is only a secondary thing that computers in all their forms (console, tablet, watch, PC, whatever) are asked to do. In fact, since it's hobby/entertainment, it's one of the LEAST important considerations when computers are judged for their usefulness. It's, like, barrel-bottom on the list of stuff that matters.
 
I disagree.

Games may not push hardware, but that doesn't mean that software, particularly OS software, won't require fresh new platforms. It's never been the app that drives the changes; it's the enabling operating systems that have always been the real drivers, that and the adoption of desirable innovations like the AGP bus, USB, PCI Express, SATA II, and so on.

A processor alone does not a computer make.

Ten years ago I told people that we could already make computers pretty damn tiny, that the major push toward miniaturization was mostly over, and that the real innovation would center around communications and how we interact with computers. The biggest sign that I was correct was the birth of the smartphone and the way business is not only embracing its use but incorporating the smartphone into our daily lives.

We have not gone as far as we can when it comes to integrating communications and computing into our lives. We are only at the early stages, a clunky, half-assed rush, and we are seeing that some ideas are terrific but still need work, while others are just terrible and don't warrant further development.

When access to information and communications becomes so smoothly integrated that it is perceived as naturally as our own senses, then we will have reached a real pinnacle; until then, the race will continue.

I am not saying that we can't have a future resurgence, but in the current market you have two primary purchasers of non-server computing devices:

- Consumers, who are extremely cost-centric and mobility-focused. They will purchase laptops and tablets and use them until they break or no longer perform basic functions. The average consumer will want to spend no more than $1,000 (and typically a lot less than that). They are not performance-driven and will purchase mostly on price and mobility.

- Enterprises, which are also cost-centric and mobility-focused. They will purchase either dumb terminals (for desktops) or laptops for their more mobile employees. They will not be driven primarily by OS, due to compatibility and cost (most of my employers were at least one generation back, sometimes two). Since they can write off IT-related expenditures over time, they will upgrade hardware when it breaks or on a 4-6 year refresh (depending on the depreciation and surplusing strategy they follow).

Power users still exist, but we are a small subset of the market (single-digit millions in a market of hundreds of millions). I am getting ready to drop $4K on a new computer, but due to the cost and integrated water-cooled components I am less likely to want to upgrade hardware (beyond upgrading drives and adding a second GPU) for several years minimum.
 
No benefit going from 1080p to 1440p? Sounds more like you're talking yourself into being happy with 1080p. Saying there's no benefit to higher resolution for you is pretty bullshit.

No, it's not bullshit at all. I have also chosen to stay at 1080p. I have a pair of ASUS 27" monitors that give me all the real estate I need to display a great-looking game and run supporting apps like TeamSpeak and an open browser for research or information related to the game I'm playing. Making this choice was a huge money saver for me, as it limits what I must do to support the displays; my games run great on a 4GB ASUS STRIX GTX 960, and I am happy with my choice.

You can say his statement is bullshit, but the problem is that you don't really understand it. What it really means is that the benefit is simply not worth the cost for many people; he just said it a little differently :D

And BTW, it really does cost a lot to push past 1080p. 2K gaming probably adds a good $400 in graphics card and monitor costs, and 4K gaming is a good $1,000 investment, which is the entire cost of my last rebuild. Some can't afford it, some can but have better places for the money, and some can afford it and just choose not to, because it will affect their current systems and all their future systems. Not many will make the jump to 4K and then go back to 1080p, because by then they have too many components already supporting 4K.

Maybe later 4K will be more the standard and not so expensive, but I still can't see myself connecting to a 65" display on the other side of the room to game on. Perhaps one of those bubble-shaped projector rigs that simulators use, that would push my buttons, but that stuff is even more expensive.
 
Has anyone looked at how VR will ramp up the CPU/GPU market all over again?

I haven't seen a real improvement in VR other than that it's prettier, cheaper, and lighter; the interface is still wonky to me. Perhaps if gesture capabilities improve and are integrated extremely well. Oh, and they need to deal with motion sickness better than they have in the past. It's not VR that has my interest, it's the immersive display technology, but I still have the same problems: how to see my coffee cup, keyboard, etc.
 
Power users still exist, but we are a small subset of the market (single-digit millions in a market of hundreds of millions). I am getting ready to drop $4K on a new computer, but due to the cost and integrated water-cooled components I am less likely to want to upgrade hardware (beyond upgrading drives and adding a second GPU) for several years minimum.

It sounds like you're at a crossroads in the whole new-computer thing and have a chance to really change your computing model, if you didn't want to spend what's basically the cost of a year's entire wardrobe for most middle-class people. Try this instead of the water cooling and whatever else you're thinking about, and put the extra money into something fun:

http://www.amazon.com/HP-Stream-11-6-Inch-Celeron-Personal/dp/B015CQ8OUY/
 
No, it's not bullshit at all. I have also chosen to stay at 1080p. I have a pair of ASUS 27" monitors that give me all the real estate I need to display a great-looking game and run supporting apps like TeamSpeak and an open browser for research or information related to the game I'm playing. Making this choice was a huge money saver for me, as it limits what I must do to support the displays; my games run great on a 4GB ASUS STRIX GTX 960, and I am happy with my choice.

You can say his statement is bullshit, but the problem is that you don't really understand it. What it really means is that the benefit is simply not worth the cost for many people; he just said it a little differently :D

And BTW, it really does cost a lot to push past 1080p. 2K gaming probably adds a good $400 in graphics card and monitor costs, and 4K gaming is a good $1,000 investment, which is the entire cost of my last rebuild. Some can't afford it, some can but have better places for the money, and some can afford it and just choose not to, because it will affect their current systems and all their future systems. Not many will make the jump to 4K and then go back to 1080p, because by then they have too many components already supporting 4K.

Maybe later 4K will be more the standard and not so expensive, but I still can't see myself connecting to a 65" display on the other side of the room to game on. Perhaps one of those bubble-shaped projector rigs that simulators use, that would push my buttons, but that stuff is even more expensive.

Nothing wrong with choosing to stay at a lower resolution. It saves you from constantly needing bleeding-edge GPUs to push high settings in games. I stuck with 1680x1050 last gen when most people were moving to 1080p or 1200p. I got through the entire generation (2007 to late 2013) with just one GPU upgrade, from an 8800 GTS to a GTX 460.
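
For a rough sense of the savings: GPU load scales roughly with the pixels pushed per frame (a simplification; real scaling varies by game, settings, and bottlenecks). A quick back-of-the-envelope sketch in Python:

```python
# Rough pixel-count comparison between common gaming resolutions.
# GPU load scales roughly with pixels per frame (a simplification;
# actual scaling depends on the game, settings, and bottlenecks).
resolutions = {
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "2560x1440": 2560 * 1440,
    "3840x2160": 3840 * 2160,
}
base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")
```

1680x1050 is about 0.85x the pixels of 1080p, while 1440p is about 1.78x and 4K is 4x, which is roughly why each step up demands a bigger card.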
 
I agree. I tried to hype myself up with Skylake, M.2, and DDR4 to justify an upgrade, but then I looked at the big picture and realized it's simply not logical. However, that might change with DirectX 12, which can take better advantage of CPU power, and I hope it does.

Same, except my son is really trying to hype it up to me lately. He's got an old i5 that's getting long in the tooth (and an AMD 7770, which is not aging well with his games). My i7-2600K is just fine for me. I'd love to upgrade, but I can't justify it. But he wants to upgrade his computer with my parts, so I might get an upgrade myself and pass mine down to him. At least it gives me a little more reason for an upgrade.
 
Same, except my son is really trying to hype it up to me lately. He's got an old i5 that's getting long in the tooth (and an AMD 7770, which is not aging well with his games). My i7-2600K is just fine for me. I'd love to upgrade, but I can't justify it. But he wants to upgrade his computer with my parts, so I might get an upgrade myself and pass mine down to him. At least it gives me a little more reason for an upgrade.

And it's a great way to get him involved with learning how to deal with computer hardware on his own. I did the same with my girls. One decided to learn, and she got all the good stuff from my hand-me-downs. Now she's not afraid of anything when it comes to computers, operating systems, etc. She isn't enterprise-level IT, but she's far from helpless.
 
I haven't seen a real improvement in VR other than that it's prettier, cheaper, and lighter; the interface is still wonky to me. Perhaps if gesture capabilities improve and are integrated extremely well. Oh, and they need to deal with motion sickness better than they have in the past. It's not VR that has my interest, it's the immersive display technology, but I still have the same problems: how to see my coffee cup, keyboard, etc.

As much hate as the Kinect gets, I think it'd make a good companion device for VR. It'd see where your body parts are and put them in-game, so gestures would be easy.

Motion sickness is something they need to fix. I'm fine with VR, but many aren't. Hell, my dad's friend fell on the floor years ago just watching him play Descent 3. So 3D and VR affect people differently. That's hard to fix when some people have no issue and others are extremely sensitive to it.

VR is one reason I'd go for a PC upgrade. If Oculus were available and worked great, but required Skylake and a high-end GPU, I'd upgrade very soon. I don't want to upgrade now and then have to upgrade again a year from now when it's released. I'll upgrade and get the best possible quality I can at the time. Why upgrade now for something that's not released yet? VR is going to be my driving factor (for now) for my next upgrade.
 
And it's a great way to get him involved with learning how to deal with computer hardware on his own. I did the same with my girls. One decided to learn, and she got all the good stuff from my hand-me-downs. Now she's not afraid of anything when it comes to computers, operating systems, etc. She isn't enterprise-level IT, but she's far from helpless.

My oldest loves playing games. He knows what does what, but he doesn't go beyond that very often.

My youngest loves computers and games. He'll replace parts, install stuff, do what it takes to make his computer faster and more stable. He's not IT level yet either, but he's got the mindset to go into it when he's older. When my oldest asks for help, I tell him to ask his little brother. Usually that'll get him to try and fix it himself (competition!) or actually ask his brother.

I definitely encourage them to learn more and experiment and just play with stuff. It can all be fixed if something is messed up. They aren't doing too bad.

How old are your girls? I thought you were an older guy.... :D
 
Geez, seems like another spreadsheet-vs-speed article from years past.
There is no need to "offer an explanation as to what the root cause is for the PC growth slowdown."
Everybody knows what the "root cause" is.
Welcome to what it feels like to experience the natural progression of Moore's Law.
In the meantime, progression means huge curved 4K hardware-synced monitors, the graphics cards needed to drive them, and the pocketbooks to pay for them.
In the past a premium setup from the ground up would cost about $2,500. Today it's about $4,000.
Technology is progressing very quickly in ways that go beyond the core (CPU, memory & system board).
 
Desktop PCs are becoming more of a professional tool, and that's not a bad thing at all. Most people just need a smartphone, tablet, or laptop, and that's perfect for them.
There will always be a need for the latest tech among professionals.

The reason I will always need a PC for non-gaming use is that I need to type a lot to troll forums, and tablets/phones are not ideal for that for me. Maybe the younger crowd sees that differently.
 
I disagree; to me, consoles are pale imitations of the best gaming platform, which remains the PC. Just because many titles are ports, it's incorrect to say that most are. Yes, many popular titles are ports, or more accurately parallel developments, which is different than a port. Fallout 4 was not a port; it was a product developed for multiple platforms in parallel, allowing for as much common code as possible.

At the same time, there are still plenty of great PC titles that will never see play on a console. These are not the most popular titles, because not everyone goes for this level of gaming, particularly with simulations, but they represent a solid enough following to have an effect on the future of the PC.

Take the Arma series, for example: the game has never been released for consoles and it won't be, because consoles don't have the power to run it. But the real discriminator between consoles and PCs has never been about anything other than the controller versus the keyboard and mouse. Although you can get a keyboard and mouse for a console, console developers have refused to develop the user interface for them. Because of this, the way the user interfaces with the game limits the title and creates an artificial barrier. If consoles had adopted the keyboard and mouse and leveraged their use in titles that called for it, the debate over PCs and consoles would have died years ago, because at its core, a console really is just a PC.

But you have to admit that the cost of entry on PC is way more than for a console. Arma is a good example of that: it would cost more than the price of a console just to buy a GPU able to run it optimally, without adding the cost of the rest of the PC, and that is for only one game today. Tomorrow another game may need the next-gen GPU to run optimally, and so on, while on console the games are optimized for the platform over its whole life cycle, which is around 7 years now.

The keyboard/mouse thing is way overblown. You think it is best because that's what you're used to and comfortable with now. People have been playing with one form or another of gamepad since the mid '80s, and joysticks before that, and they had just as much fun as you do. What matters is that all the players in a game use the same input device, to level the playing field. If you need help playing FPS games with a gamepad, just search YouTube; there are quite a bunch of tutorials on how to excel at FPS games with a gamepad.
 
It sounds like you're at a crossroads in the whole new-computer thing and have a chance to really change your computing model, if you didn't want to spend what's basically the cost of a year's entire wardrobe for most middle-class people. Try this instead of the water cooling and whatever else you're thinking about, and put the extra money into something fun:

http://www.amazon.com/HP-Stream-11-6-Inch-Celeron-Personal/dp/B015CQ8OUY/

Get thee behind me, Satan (or Skibblekat, as the case may be) :p
 
Yup, it pains me to say it, but I have to agree. The prices are just disproportionate to what you get.

That's a problem because Intel, AMD, and Nvidia know that they can't make the hardware much faster, and it doesn't matter anyway because the games refuse to use the hardware. Why do you think AMD has rebranded their 7000 series to R9 200s to R9 300s? You're using 3-year-old hardware and don't even know it.

With this economy it's hard to justify spending $500 every time a new GPU generation pops up.
It makes better sense to buy a console for that price and play games that are optimized for it instead.
I blame the AMD/Nvidia duopoly for this. Without more players on the field, they can keep GPU prices artificially high.

Hell, for the price of a pair of 980 Tis you can get all three consoles presently available, plus a couple of games and accessories for each. That's what I did, and I still haven't regretted it.
Except that a 980 Ti is ridiculous in price-to-performance. A regular 980 also doesn't make sense. There's a reason the 970 is the most popular GPU on the market: the 980 is only 10% faster than the 970, while the 980 Ti is only 10% faster than that. So basically, after the 970 your return in value drops faster than a deuce.
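
To put rough numbers on that value curve (the street prices here are hypothetical round figures, and the performance ratios just restate the ~10% steps above):

```python
# Illustrative price-per-performance math for the ~10% steps described
# above. Prices are hypothetical round numbers, not real quotes.
cards = {
    "GTX 970": {"price": 330, "perf": 1.00},
    "GTX 980": {"price": 500, "perf": 1.10},      # ~10% faster than a 970
    "GTX 980 Ti": {"price": 650, "perf": 1.21},   # ~10% faster than a 980
}
for name, card in cards.items():
    dollars_per_perf = card["price"] / card["perf"]
    print(f"{name}: ${dollars_per_perf:.0f} per unit of 970-level performance")
```

With those made-up prices, the 980 costs roughly 40% more per unit of performance than the 970, and the 980 Ti over 60% more, which is the diminishing-returns curve in a nutshell.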

And don't think we've hit a brick wall in gaming either. Playing Fallout 4, I realized that we aren't utilizing physics in games. Yes, it all looks impressive, but when a Deathclaw turns around and plays hide-and-seek because I'm in a wooden building and it can't get to me, you know you've done fucked up. Why isn't the Deathclaw wrecking a wooden hut that I'm sure a strong gust of wind could knock over? Because there's no physics. Because consoles couldn't handle the physics on the CPU. Meanwhile, our quad-core and 8-core CPUs aren't utilized very well at all.

#1 is subjective. I am kind of meh on Win10, since it plus Crimson brought me video bugs with my R9 290.

Subjective my ass. Windows 10 did an update in the middle of the day that took an hour. When it finally booted up, it had removed CPU-Z and changed my default applications to whatever Windows 10 wants. Meanwhile, another PC I have went to Windows 10 without me even knowing about it. It just upgraded itself without my being aware of it. How in the? When did it? Why?

As a PC gamer I literally have no choice but to run Windows. I'd run Linux, I'd even run Mac OS X if it could play all my games.
 
Except that a 980 Ti is ridiculous in price-to-performance. A regular 980 also doesn't make sense. There's a reason the 970 is the most popular GPU on the market: the 980 is only 10% faster than the 970, while the 980 Ti is only 10% faster than that. So basically, after the 970 your return in value drops faster than a deuce.

In Canada, a GTX 970 costs between $425 and $500 before tax. That is $100 to $200 more than a console all by itself.

If you factor in the rest of the components, you end up with a machine that costs two to three times (or more) the cost of a console. Yes, the games are better looking on a PC if you spend the cash, but the point is that this "better looking" comes at a price which many people today aren't ready to pay just to play games.
 
Get thee behind me, Satan (or Skibblekat, as the case may be) :p

What's wrong with that suggestion? Those are really nice little computers.

Trolling aside, I'm totally taking my own advice, even. Yesterday evening, I got my desktop ready for donation by cleaning it up and loading a fresh copy of Mint on it so I can give it to my local animal shelter (they sell stuff like that to raise money for kitty food and litter, and they're a no-kill shelter, which is super important too). I'm going to be using a replacement Asus EeeBook x205ta, though GRR at Asus for making it such a pain to install Linux that I'm stuck with Windows 8.1, but I'll still have my older Eee on Mint as my primary computer.

The desktop thing is a huge waste of space, energy, and money. Never mind that you can't carry it anywhere or use it without being plugged into a power outlet. In retrospect, I totally should have gotten an HP Stream, since they're apparently a million times less painful to load with Linux. Maybe in a few months I'll go get one and give the EeeBook to the kitty shelter. I mean, it's okay. I've had it for a few days now, and it does everything a Windows computer should do EXCEPT become a nice Linux box.

There's a 33-page-long thread on the Ubuntu forums with people struggling to get full hardware support and to work around dumb decisions by Asus, like using a 32-bit UEFI firmware (meaning you need a 32-bit EFI boot loader to get a 64-bit distro working, or you have to resort to a 32-bit distro) and a really off-the-wall sound card model, among a bunch of other things like poor touchpad choices and an iffy wifi adapter.
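
For anyone fighting the same 32-bit UEFI problem: the usual workaround is to drop a prebuilt 32-bit GRUB loader (bootia32.efi) onto the live USB so the firmware has something it can actually boot. A minimal sketch, assuming you've already obtained bootia32.efi and the stick's boot partition is mounted (the /media/usb path is hypothetical):

```python
# Minimal sketch: add a 32-bit GRUB EFI loader to a live USB so a
# 32-bit-only UEFI firmware (like the x205ta's) can boot a 64-bit
# distro installer. Assumes bootia32.efi was obtained separately and
# the USB's boot partition is mounted at /media/usb (hypothetical path).
import shutil
from pathlib import Path

usb = Path("/media/usb")
loader = Path("bootia32.efi")

dest = usb / "EFI" / "BOOT"  # the path a 32-bit UEFI firmware searches for a loader
dest.mkdir(parents=True, exist_ok=True)
shutil.copy(loader, dest / "bootia32.efi")
print(f"Copied {loader} to {dest}")
```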
 
As a PC gamer I literally have no choice but to run Windows. I'd run Linux, I'd even run Mac OS X if it could play all my games.

There are lots of games available for Linux (maybe OS X too, but I can't speak from experience about Apple's platform). The big thing is getting over the idea of not having support for some games you might really like and don't want to move on from. Like, despite my aversion to FPS-type stuff, I like Fallout. In fact, I liked most of Bethesda's fantasy stuff too, like Skyrim, and that was FOREVER holding me back from walking completely away from Windows as being necessary for gaming. In retrospect, it was kind of a weird emotional attachment to the idea of playing them, even though I hardly ever actually played those games.

So, a few games work in WINE and PlayOnLinux. I've even tried some really ancient DOS-based stuff in DOSBox, which is kinda fun. The DOS stuff works really well when you're limited by an old Atom N270 processor like I am, at least until I can buy a more modern Bay/Cherry Trail system that is Linux-friendly.

Either way, I'd suggest sorting the games you like into categories and then seeing whether those categories are available under Linux. For me, it's open-world post-apocalypse stuff, open-world fantasy stuff, visual novels, and (I shamefully admit) some kind of space trading thing. On Windows, that was Fallout, Skyrim, and the X series, with visual novels being more of a once-and-done thing, since they're more or less passive reader things you don't hold onto for long. On Linux, it's NeoScavenger, The Unreal World, some random thing I found online called Frontier First Encounters that I run in DOSBox, and there's actually a pretty good selection of VN games on Steam that are either free or under $10 and work on Linux. I might also try some 2D JRPG stuff for something different.

That's just the tip of the Linux gaming iceberg too, since I'm really slowed down by my hardware's limitations, and even still, there are more hours of fun there than I'd ever have time to spend playing games on the other side of the OS divide. A jump to Linux is a great way to explore games you'd otherwise never have tried, like that Frontier thing I found on the internet.
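
If anyone wants to try the DOSBox route, launching an old game really is about this simple (a sketch; the install directory and executable name are made up):

```python
# Sketch: launch an old DOS game in DOSBox from Python. Pointing DOSBox
# at an executable mounts that file's directory as C: and runs it.
# The install path and GAME.EXE name below are hypothetical.
import subprocess
from pathlib import Path

game_dir = Path.home() / "dosgames" / "frontier"
subprocess.run(["dosbox", str(game_dir / "GAME.EXE")])
```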
 
The industry is just settling down like many other industries have. Saying that PCs are failing is like saying that refrigerators or cars are failing... except that almost everyone has one and almost everyone uses one. The only thing that changed is that people no longer have to buy a new one every 18-36 months in order to be able to run new programs. Instead, people now buy PCs and essentially run them until they stop working one way or another - which is exactly what people do with products from just about every other industry. That in no way means anything is "failing".

Even with mobile making inroads into the PC market, that still doesn't mean PCs are dying, any more than the car market is dying because some choose to ride a bike.

This is true, but it applies primarily to the developed world. I think the cries of death come from people who look at sales patterns globally. Smartphones sell everywhere, so their expansion phase is still in bloom (even if they're pretty crappy ones depending on where you are), so they always show as expanding. And if you're not expanding, you're dead, by financial-analyst logic. The problem with PCs is that they require power, and in places where power is not reliable a desktop would be a pain, even a laptop with a 3-hour battery life. Whereas the phone will last most of the day and is multi-use. I think the smartphone displaced the PC in the undeveloped world.
 
I am not saying that we can't have a future resurgence, but in the current market you have two primary purchasers of non-server computing devices:

- Consumers, who are extremely cost-centric and mobility-focused. They will purchase laptops and tablets and use them until they break or no longer perform basic functions. The average consumer will want to spend no more than $1,000 (and typically a lot less than that). They are not performance-driven and will purchase mostly on price and mobility.

- Enterprises, which are also cost-centric and mobility-focused. They will purchase either dumb terminals (for desktops) or laptops for their more mobile employees. They will not be driven primarily by OS, due to compatibility and cost (most of my employers were at least one generation back, sometimes two). Since they can write off IT-related expenditures over time, they will upgrade hardware when it breaks or on a 4-6 year refresh (depending on the depreciation and surplusing strategy they follow).

Power users still exist, but we are a small subset of the market (single-digit millions in a market of hundreds of millions). I am getting ready to drop $4K on a new computer, but due to the cost and integrated water-cooled components I am less likely to want to upgrade hardware (beyond upgrading drives and adding a second GPU) for several years minimum.

You missed a big one: local, state, and federal governments. These users are similar to the enterprise, with a little less emphasis on mobility and a much different perspective on costs. For instance, they can be so driven by lowest-bid sourcing that they make stupid purchases that fail to satisfy requirements, delay development milestones, and generally add costs that exceed the initial savings.

I have seen expensive equipment purchased and either sit in storage rooms for years or get installed into server racks, powered, but not actually used for anything at all, just drawing power and creating more heat that has to be cooled. You know why it will sometimes sit in a rack doing nothing? Because the organization that owns it doesn't want it to look unused, since someone might take it away and they might want it later. I know of very current hardware that is just sitting there using up juice, along with a maintenance agreement that alone costs tens of thousands to cover spares, on-site service, and non-returnable hard drive warranties.

Wasteful? God, yes. Necessary? Maybe; unfortunately, I think it's just the nature of the beast. I know of a full SAN stack of NetApp equipment that is sitting there becoming more and more dated because the organization that owns it decided the maintenance warranties and license renewals would cost too much. They forgot that I had set the systems up so they could run them at a third of the cost. I work on a different contract now; they want to dump the equipment, and we want to get it. A transfer would take about an hour of total time signing documents by three people; it's been three months and it still hasn't happened.

Just another reason why you want the lowest echelon of government doing everything they can and the next higher only what the lower ones can't handle. The higher it goes, the greater the overhead and potential for waste and abuse.
 
... Like, despite my aversion to FPS-type stuff, I like Fallout. In fact, I liked most of Bethesda's fantasy stuff too, like Skyrim, and that was FOREVER holding me back from walking completely away from Windows as being necessary for gaming.....

You have an odd concept of what an FPS-style game is, because Bethesda has never produced one. Bethesda's titles are single-player adventure RPGs.

Another issue with alternate OSs and their limited game availability is that many gamers enjoy the social aspect of multiplayer gaming. Whether it's FPS, MMO, or some other form, it's the interaction with other people that they enjoy most. Given that so many multiplayer games are designed to use servers to run the game world, only reasonably current titles remain supported. Also, gaming communities are fickle and migrate from game to game. Multiplayer online gamers are nomadic, so only current titles drive them, and only Windows really provides the OS needed.
 
You have an odd concept of what an FPS-style game is, because Bethesda has never produced one. Bethesda's titles are single-player adventure RPGs.

They're in a first-person perspective and you run around shooting stuff. *shrug* I guess there's a storyline too, but that doesn't really change it much as far as I care anywho. It's just splitting hairs to call it something else.

Another issue with alternate OSs and their limited game availability is that many gamers enjoy the social aspect of multiplayer gaming. Whether it's FPS, MMO, or some other form, it's the interaction with other people that they enjoy most. Given that so many multiplayer games are designed to use servers to run the game world, only reasonably current titles remain supported. Also, gaming communities are fickle and migrate from game to game. Multiplayer online gamers are nomadic, so only current titles drive them, and only Windows really provides the OS needed.

That's probably true, but I didn't think most people who played games were really that interested in being social with other people. I mean, I was seeing that MMO stuff is sorta going away, and there's only a small number of people who do multiplayer shoot-y stuff, since none of those are very fun because of voice-chat profanity kids and people who spend thousands of dollars on the really expensive microtransaction stuff to kill everyone else.
 
Hardware quality and performance have improved immeasurably.

Software quality and performance have nosedived just as immeasurably.

If the game houses actually spent 5% of their $100 million budgets on QA testing the product for at least 3-4 months before release, rather than a week before (or not bothering at all), we could all be running Fallout 4/Witcher 3 at 1080p/high settings on 2009 hardware.

Bad, lazy code is the norm nowadays. No one wants to properly test software anymore.

Coders... have some pride in your work!

Project managers... properly and realistically plan your projects so the coders can do their jobs properly!
 
They're in a first-person perspective and you run around shooting stuff. *shrug* I guess there's a storyline too, but that doesn't really change it much as far as I care anywho. It's just splitting hairs to call it something else.



That's probably true, but I didn't think most people who played games were really that interested in being social with other people. I mean, I was seeing that MMO stuff is sorta going away, and there's only a small number of people who do multiplayer shoot-y stuff, since none of those are very fun because of voice-chat profanity kids and people who spend thousands of dollars on the really expensive microtransaction stuff to kill everyone else.

They are also in a third-person perspective; can never split enough hairs, you know ;)

That is solved when you play games and join clans that are a little more picky about the maturity of their members. Annoying, vulgar drama-queen types are usually shown the digital door. Many games, like World of Tanks, World of Warships, MechWarrior Online, etc., support clans/teams. Some teams are mature, some are not, and some place competitive gaming prowess above social graces; there is a place for everyone.

In my MechWarrior Online clan we spend a good amount of time between matches discussing personal things, work, and so on, not just the game.
 
There is still little incentive to upgrade if you have anything Sandy Bridge or later. The performance gains just aren't there to justify it... unless you are a gamer. If you are a gamer, you know a video card swap will solve most of your gaming ills.
 
Hardware quality and performance have improved immeasurably.

Software quality and performance have nosedived just as immeasurably.

If the game houses actually spent 5% of their $100 million budgets on QA testing the product for at least 3-4 months before release, rather than a week before (or not bothering at all), we could all be running Fallout 4/Witcher 3 at 1080p/high settings on 2009 hardware.

Bad, lazy code is the norm nowadays. No one wants to properly test software anymore.

Coders... have some pride in your work!

Project managers... properly and realistically plan your projects so the coders can do their jobs properly!

I've been doing software for two decades and let me tell ya... there's no such thing as code that works. lol. The good developers are good at papering it over so the user never realizes he's handling a turd.
 
They are also in a third-person perspective; can never split enough hairs, you know ;)

Yes, you can split enough hairs or even too many. Split ends suck which is why shorter hair is much betterer because it's stupid to expect ends to not split when that part of your hair has been on your head for like many years. Anyway...

That is solved when you play games and join clans that are a little more picky about the maturity of their members. Annoying, vulgar drama-queen types are usually shown the digital door. Many games, like World of Tanks, World of Warships, MechWarrior Online, etc., support clans/teams. Some teams are mature, some are not, and some place competitive gaming prowess above social graces; there is a place for everyone.

In my MechWarrior Online clan we spend a good amount of time between matches discussing personal things, work, and so on, not just the game.

The trouble with finding mature, reasonable people in those kinds of games is that you have to put up with going through a bunch of creeps first. And you still can't control who you run into online outside your own group. Sure, the people on your team might be supportive of you, but there's always someone you end up having to mute, or who makes you regret turning on your mic and speaking.

I've been doing software for two decades and let me tell ya... there's no such thing as code that works. lol. The good developers are good at papering it over so the user never realizes he's handling a turd.

Well, the crappy-coding thing isn't just because of the nature of programming itself. There's a huge number of external forces too, and a lot of it goes back to a lack of knowledge on the part of the organization making a game, plus the economic pressure of pushing out a product that will generate sales in a very competitive market where margins are extremely small. In fact, if you look at all the games that were heavily anticipated but came out totally broken, the thing you'll find in common among a lot of them is a huge rush to get something out the door as fast as possible. So it's more of a budget-and-money thing. Unless gamers are all willing to pay more and wait longer for something to be created, they'll have to deal with junky-junk products.
 
Right before the introduction of dual-core CPUs, I was wondering how high the GHz were going to go. Then dual-core came along and clock speeds went down, and then 4 or more cores came along, but clock speeds didn't go ridiculously high. It just seems everything leveled off, and most CPUs are good enough to keep going.

I have a 955BE, and I thought Battlefront was finally going to push me to upgrade. But I bought an R9 380, and I'm able to keep on going. Now I want to see how long I can keep pushing this CPU. I've been using it for going on 6 years now. That was probably unheard of before the days of quad-core.
 