5 Ways Technology Is Actually Getting Worse

Wow, can't say I've seen crap this ignorant in a while.

The 5 real ways technology has gotten ass backwards.

#1 Wireless keyboards and mice. There's no reason to have them, and you get connection issues. Not to forget, batteries.

I like my wireless mouse. There's no cord to get stuck or induce drag, so it's a better user experience. I never have connection issues; maybe you should try adjusting the receiver position for your mouse setup? Also, get a mouse with a recharging cradle so no batteries need to be constantly replaced.



#2 Wireless internet at home. For the lazy person who doesn't want to run a wire to their router. Disconnection issues with Netflix and online gaming are usually because of this.

BS, going with a better solution doesn't mean people are lazy. You don't ride a horse to work, you drive a car to work. Maybe that's just because you are too lazy to learn to ride a horse and shovel $h!+, right?

Also, not everybody uses a desktop as a primary PC. Some people like to move around with a laptop a lot.

Wireless can also be a lot cheaper than running Cat5 to multiple locations. But I guess in those cases people don't use wireless because it's economically superior, they use it because they're lazy, right?




#3 Tablets. All the power of a cell phone with the size of a laptop, for the low low price of $500+.

They might be overpriced for the hardware, but they fill a niche. It's just another tool, and just because you don't have a use for a tool doesn't mean it's worthless. For an HVAC technician chainsaws are pretty useless; for a logger, however, they're incredibly valuable. Tablets are great for sales. They make presentations easier, and honestly, the high-tech image sells.



#4 Multi-core CPUs. Most applications don't use more than one core, so having a dual or quad core can feel useless. Now I hear eight cores are coming soon? Though this is more the fault of the applications themselves.

OK, so you browse webpages and maybe send a few emails. Guess what, world =/= you. Lots of people do other things besides just basic tasks.



#5 Linux, specifically Ubuntu. Slower, consumes more battery, and yet still doesn't have a market share beyond 1%. In fact, it's lost market share compared to before.

So you're saying that computing is worse off because of a fringe desktop usage OS? Really?!?!

Oh, wait, just because you don't use it on the desktop it's useless, right?

Nevermind that:

“Forty percent of servers run Windows, 60 percent run Linux…”
–Steve Ballmer (September 2008)


Linux Starts to Eat Microsoft’s Lunch in Servers
http://gigaom.com/2010/10/12/linux-starts-to-eat-microsofts-lunch-in-servers/

-About 79 percent of companies are adding more Linux instead of other operating systems in the next five years, compared to a mere 21.3 percent which expect to add more Windows servers over the same period.
-More respondents reported that their Linux deployments are migrations from Windows than any other source, including Unix migrations.
-Two-thirds of users surveyed say that their Linux deployments are brand new deployments.
-Almost as many (60.2 percent) respondents say they will use Linux for more mission-critical workloads over the next 12 months.
-At 86.5 percent, a vast majority of respondents report that Linux is improving, and 58.4 percent say their CIOs see Linux as more strategic to the organization as compared to three years ago.
 
A company's goal is not to make the best product available. It's to make slightly better products and maximize profit from them for as long as it can.

Only in war would a country or company produce and innovate at maximum efficiency.

War is not always a bad thing, you know.
 
They might be overpriced for the hardware, but they fill a niche. It's just another tool, and just because you don't have a use for a tool doesn't mean it's worthless. For an HVAC technician chainsaws are pretty useless; for a logger, however, they're incredibly valuable. Tablets are great for sales. They make presentations easier, and honestly, the high-tech image sells.

Agreed. I was using them for years before most people had a clue; they've been in the business world for a long time. Ironic that today's mobile-OS tablets, which are nothing but phones with larger screens, are considered high-tech.

Windows 8 will be interesting as tablets based on it will be small computers vs big phones.
 
Only in war would a country or company produce and innovate at maximum efficiency.

War is not always a bad thing, you know.

Sure, if you ignore the people who die in wars and if your country gets bombed, I guess war isn't a "bad" thing. :rolleyes:
 
Tablets are great for sales. They make presentations easier, and honestly, the high-tech image sells.

This. My bread and butter right now is web design, and I've gotten noticeably more work since I got my iPad and started using it for demos.
 
I have no issues whatsoever with my wireless mouse and keyboard....especially in gaming. Those of you who think you can tell the 0.000000000000001 second difference in "lag" are either retarded, superhuman, or too cheap to buy a decent wireless setup.
 
I just remembered computer sound. To me it's gotten far worse. Remember SoundStorm? How does it feel to own a modern nForce chipset, only to realize that nForce 1 and 2 had the best sound ever? I believe it's the only hardware DDL sound card to ever exist. Everything else is done in software. On top of that, most modern sound cards have no 3D sound. We now use the CPU to do all that business.

I just feel that sound has gotten much worse nowadays. Where's my freakin' SoundStorm, Nvidia?


Sound Storm? pfft. I miss Aureal.
 
This. My bread and butter right now is web design, and I've gotten noticeably more work since I got my iPad and started using it for demos.

But it's not due to any technological superiority or ability that the iPad has; it's just the popular thing right now. Any off-the-shelf laptop could display your web content better.
 
A company's goal is not to make the best product available. It's to make slightly better products and maximize profit from them for as long as it can.

Only in war would a country or company produce and innovate at maximum efficiency.

War is not always a bad thing, you know.

You're sorta right, but you're making your point with the wrong metaphor.

Companies make better products for the money when they have stiff competition, and well, war is pretty much the ultimate competition.

The problem these days is a lack of competition, and it's due to many reasons: government involvement, taxes, BS patent lawsuits, etc...

Competition only works when it isn't shackled by horse shit.
 
The whole idea of technology making us dumb has long bugged me, and I'm against the whole concept. The only time it becomes an issue is in the extraordinary circumstance of technology being wiped out, and the things that 'there's an app for' really only relate to our modern technological lifestyle anyway. Throughout my schooling I was very good at maths, top of my class in primary and high school, did calculus, etc. However, unlike most people, I never learned my times tables, in the sense that I was never able to instantly regurgitate them; I calculate 6x7 the same way I would calculate 27x482, in my head or, more to the point, with a calculator. I cannot train my brain to do things in a way that is less efficient; it's like my brain won't let me waste space on useless information.

This is a ridiculous Luddite mentality. Oh no, let's destroy the loom, people can't remember how to weave by hand anymore. I suppose we should cut off our scheme water pipelines because people have forgotten how to collect water in clay pots and carry them home. Perhaps we should get rid of our houses because we've forgotten how to fend off wild animals trying to eat us. If an app can make my life easier and I can rely on it, there's no reason to learn that skill; better to leave my internal storage for things there isn't an app for yet.

As for the shortfalls of tech, I think a major one is that, comparatively, the use of hardware with a reduced instruction set is diminishing. Today it's cheaper to get some generic mass-produced chip that works for 'all of the above' and base the software/firmware around it. This has many disadvantages. Stability is compromised: electricity isn't perfect and more circuits means more chances of failure, compatibility between the software and hardware is often dubious, and the hardware may be changed out from underneath it because of pricing or whatever, causing glitches and freezes. More circuits also means more heat, more energy consumption, and a shorter lifespan.

It used to be a necessity for the hardware to match the software because there was only so much you could pack in there. But today, in hardware as in software, wasted space is nobody's concern. Software these days wastes a lot of space on redundant code; software used to be measured in bytes, then kilobytes, then megabytes, and some software is now in gigabytes... and really this software isn't worlds apart.

The one thing I can't stand is how programs are growing in size faster than the hardware that has to support them (looking at you, Crysis). Just because the CPU got faster doesn't mean we should stop making our code as efficient as possible.
 
It's perfectly normal. There are no artificial limitations put on [some SSDs].

Says Chris, author of the recent NAND article here on the [H]:

[SandForce] makes the firmware, [manufacturers] like OCZ pay for which switches are turned on and distribute the products.

Wonder if some of these hidden bits affect performance and price ...
 
That article was retarded. Why would social networking be a bad idea? Those who don't care about their privacy were always exposed to everyone pretty fast, with or without the internet. None of the ways it mentioned showed technology becoming worse. They probably meant the technological impact on society, but still, it's shit no one cares about after all.
 
Well, most points of this article are meh. I only really agree with the battery thing. Why do I need 1 GHz on my Android (Galaxy S) if it's out of juice in just a day? Even faster if you run a few widgets and leave the data connection open. It may not be as bad on the iPhone, but the battery issue holds mobile devices back significantly.

And you know what is really holding PCs back? Five-year-old console technology. We'll never advance past DirectX 9 or have AA until some new generation of consoles gets released (still with ancient and cheap tech). Whenever I turn on my 360, the jagged edges of polygons make me sick.

Really?
1) [...] Wired on the other hand has always been a bane when it comes to the cable hanging when I need to move the mouse fast. Wireless keyboard, OK, I'll agree... pointless.

How far do you move your mouse? Like, a meter? I can't remember a single situation where the cable hung on something, and the additional "drag" that someone above mentioned is just ridiculous. I can't feel any "drag" at all if I lift the mouse and drag the cable around. Maybe my desk is just exceptionally smooth and more common desks have sandpaper for a surface. I understand that it's just a matter of preference and what you got used to, so use whatever floats your boat. For me, wireless mice and keyboards are useless because the battery adds significant weight to the mouse, and I hate that.
 
Everything Apple "invents" takes technological progress backward.

iPod - non-replaceable batteries, lost expansion ability for years in most players, MP3s took a backseat to proprietary files
iPhone - camera quality got worse, keyboards mostly disappeared from smartphones, batteries stayed small
iPad - cheap netbooks disappeared, replaced by less practical, expensive tablets.
 
iPad - cheap netbooks disappeared, replaced by less practical, expensive tablets.

Not exactly. The reality is that full-fledged notebooks have basically become priced like netbooks were. My sister just picked up a cheapo laptop for $280. Nothing great, but perfect for her needs, and the price was right. Netbooks are simply getting priced out on the low end by laptops.
 
1) No reason to have wired. Wireless has the same performance, and a "GOOD" wireless mouse has li-ion batteries (see the Logitech G7 and Performance MX). No connection issues; yet again, quit buying cheap crap. Wired on the other hand has always been a bane when it comes to the cable hanging when I need to move the mouse fast. Wireless keyboard, OK, I'll agree... pointless.
As a gamer, you stick with wired for the response time. There was a point where people would even measure the latency on all their devices, including mice.
2) Has nothing to do with being lazy and everything to do with being unnecessary. Remember the quality-of-mouse comment? Same goes for a router. Buy a good router and connection speed is a non-issue. I stream Netflix HD and 720p content wirelessly all day without issue to my PS3. The only limiting factor on 1080p is the wireless built into the PS3; I can stream it to my laptop fine. Quit buying cheap crap and assuming that because your Ghetto Box 2000 sucked, everything else must as well.
As a person who plays games, I hate dealing with people who use Wifi. They're guaranteed to disconnect. No matter how good your wifi is, your download speed is going to be faster wired.

Wifi is good for laptops and portable devices that browse the internet and such, but I would never trust it for gaming or for streaming video. It's worse when ISPs supply the router, especially with Verizon, where you have no choice but to use their router, which has to be the worst wifi router ever made.

Can you have good wifi? Sure you can, but wired is always better: for ping, for throughput, and for reliability. If you're a gamer, it's wired.

4) You clearly aren't a gamer, programmer, or artist. I have more than a few applications that take full advantage of my 6-core. This easily has to go down as one of the dumbest statements I have witnessed this year.
I do all the things you said, but I feel multi-core CPUs are the wrong way to go. I'd rather see Intel or AMD find a way to treat multi-core CPUs as a single core, rather than relying on software developers to rewrite their software. As a gamer, a lot of games aren't taking advantage of multicore, let alone 64-bit.
 
As a gamer, you stick with wired for the response time. There was a point where people would even measure the latency on all their devices, including mice.

Competitive gaming, sure. The games I play (nearly all SP) don't really matter.

As a person who plays games, I hate dealing with people who use Wifi. They're guaranteed to disconnect. No matter how good your wifi is, your download speed is going to be faster wired.

Wifi is good for laptops and portable devices that browse the internet and such, but I would never trust it for gaming or for streaming video. It's worse when ISPs supply the router, especially with Verizon, where you have no choice but to use their router, which has to be the worst wifi router ever made.

Can you have good wifi? Sure you can, but wired is always better: for ping, for throughput, and for reliability. If you're a gamer, it's wired.

Gaming PC is wired, the rest of the house is wireless. Great for the wife's PC and iPad, great for company, great for the occasional gaming session in the living room on the laptop. No drilling or drywall patching required.


I do all the things you said, but I feel multi-core CPUs are the wrong way to go. I'd rather see Intel or AMD find a way to treat multi-core CPUs as a single core, rather than relying on software developers to rewrite their software. As a gamer, a lot of games aren't taking advantage of multicore, let alone 64-bit.

There's more to life than games. Leaving one core available during a render (or whatever) is handy so I can surf (or whatever) while something else is running. It can also improve the performance of games a bit--just upgraded an old Athlon to dual core to extend its life and it made a huge difference.
 
There's more to life than games. Leaving one core available during a render (or whatever) is handy so I can surf (or whatever) while something else is running. It can also improve the performance of games a bit--just upgraded an old Athlon to dual core to extend its life and it made a huge difference.
I'm not saying that everyone should stick with single core, but why can't engineers find a way to feed a single-threaded application into a multi-core CPU?
 
I'm not saying that everyone should stick with single core, but why can't engineers find a way to feed a single-threaded application into a multi-core CPU?

You'd never be able to get an entire app to run single-threaded across multiple cores, because the registers/cache would have to be duplicated (preferably at the hardware level), and probably some semblance of 'state' maintained as well. It would wind up making the app less efficient.

Most apps can offload different operations across CPUs, but then you have a multithreaded app. ;)
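
To make that concrete, here's a minimal Python sketch of the distinction; the workload and function names are hypothetical, just for illustration. The serial version stays on one core no matter how many the CPU has; to spread across cores, the developer has to split the work into independent tasks themselves, which is exactly the "rewrite their software" step.

[code]
# Minimal sketch, assuming a hypothetical, embarrassingly parallel workload.
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    # Pure CPU-bound work; a single-threaded program runs this on one core.
    return sum(i * i for i in range(n))

def serial(jobs):
    # One core does everything, regardless of how many cores the CPU has.
    return [crunch(n) for n in jobs]

def parallel(jobs):
    # The developer splits the work into independent tasks so they can run
    # on separate cores; at that point it's a multi-process app, not magic
    # done by the hardware.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(crunch, jobs))

if __name__ == "__main__":
    jobs = [500000] * 8
    assert serial(jobs) == parallel(jobs)  # same results, different core usage
[/code]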
 
As a gamer, you stick with wired for the response time. There was a point where people would even measure the latency on all their devices, including mice.

Was and still is are two entirely different things. I am a competition-level gamer, as mentioned previously, and I use a wired mouse. Perhaps at the extreme upper levels of SC2, with 300+ APM, you "might" be able to see a difference. I know in intense matches I have hit 200-225 sustained APM and have yet to hit a latency issue. It is a moot point based on out-of-date information.

As a person who plays games, I hate dealing with people who use Wifi. They're guaranteed to disconnect. No matter how good your wifi is, your download speed is going to be faster wired.

Main gaming PC is clearly wired. Again, I was talking about streaming. I do some online gaming over wireless with the PS3, but not much. I have never lagged or disconnected, and I hate laggers as much as you do. Point is, wireless is not useless, as you claimed.

Wifi is good for laptops and portable devices that browse the internet and such, but I would never trust it for gaming or for streaming video. It's worse when ISPs supply the router, especially with Verizon, where you have no choice but to use their router, which has to be the worst wifi router ever made.

Again, see the above. Works fine for streaming video if you have a router that is not garbage.

Can you have good wifi? Sure you can, but wired is always better: for ping, for throughput, and for reliability. If you're a gamer, it's wired.

Well no joke wired is better.

I do all the things you said, but I feel multi-core CPUs are the wrong way to go. I'd rather see Intel or AMD find a way to treat multi-core CPUs as a single core, rather than relying on software developers to rewrite their software. As a gamer, a lot of games aren't taking advantage of multicore, let alone 64-bit.

Multi-core CPUs came about because manufacturers were hitting speed and heat limitations on single-core silicon. While a new material may eventually eliminate that issue, for now it is the single best option. I will take an engineer's opinion on this matter over yours. Very few new games don't benefit from multi-core or 64-bit. While they may not directly take advantage of the specific feature set, they still benefit from having their own dedicated core instead of sharing with all the other applications. Your argument here is utterly absurd. Have you not learned about setting core affinity yet? Even for games not optimized for multi-core, that is a huge benefit.
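
For anyone who hasn't played with affinity before, here's a rough sketch of what it looks like programmatically. This uses Linux-only Python calls (os.sched_getaffinity / os.sched_setaffinity); the choice of cores is just an example, and on Windows the usual route is Task Manager's "Set Affinity" dialog instead.

[code]
# Rough sketch of pinning a process to specific cores (Linux-only os calls).
# pid 0 means "this process"; substitute a game's PID to pin that instead.
import os

pid = 0
cores = sorted(os.sched_getaffinity(pid))
print("allowed cores before:", cores)

# Pin the process to the last two cores, leaving the rest for everything else.
os.sched_setaffinity(pid, set(cores[-2:]))

print("allowed cores after:", sorted(os.sched_getaffinity(pid)))
[/code]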
 
I use a wireless mouse in the above post, btw; mistype. Bloody no edit... at least a 30-second edit window would be nice, guys.
 
What bandwidth caps? I've never had a bandwidth cap on any of my lines, going back to ISDN days...
 
What bandwidth caps? I've never had a bandwidth cap on any of my lines, going back to ISDN days...

Most cell providers have a soft cap (2-5 GB) where they either yell at you or slow your speed (T-Mobile is infamous for offering 4G cellular with a 2 GB/month soft cap, after which you're back to EDGE).

Most (all?) cable internet providers have a hard cap, usually 200-250 GB/month, after which you're charged overages, slowed, forced to the next tier (rarely), or hit with some combination of the three.

My DSL (Windstream) has a soft cap (not 100% sure where) at which it cuts my speed in half.

The fiber I'm considering getting in Oct/Nov has a cap too that's "more than most houses will ever use," but nobody can tell me exactly what...

Oh, and on wired/wireless:
I have an old Logitech wireless mouse I use with my laptop at times, and while I'd never use it for twitch gaming (I mean, it's Logitech!), I've never noticed any problems with accuracy in Maya or response time in older games. The only reason my current mouse is wired is that I like the side buttons and don't want to drop $75 for a new one with them just to "cut the cable"...
 
I downloaded a bunch over 4G on Sprint this month and noticed that my speeds dropped back to 3G even with the 4G icon active, and I had a bunch of 3G transfer on my account (unlimited data plan). There seems to be no 4G transfer amount on the account pages. Also, 4G is only about half the speed it was last year.
 
I downloaded a bunch over 4G on Sprint this month and noticed that my speeds dropped back to 3G even with the 4G icon active, and I had a bunch of 3G transfer on my account (unlimited data plan). There seems to be no 4G transfer amount on the account pages. Also, 4G is only about half the speed it was last year.

I played with 4G at my local Sprint store and literally couldn't tell a difference... (However, down near Atlanta earlier this year I noticed 4G at 5 Mbps vs 3G at 1.5 Mbps; has it really dropped that much?!)
 
I played with 4G at my local Sprint store and literally couldn't tell a difference... (However, down near Atlanta earlier this year I noticed 4G at 5 Mbps vs 3G at 1.5 Mbps; has it really dropped that much?!)

Try Verizon 4G LTE, you'll notice a difference. Of course it's VERY expensive. It's faster in spots than my Time Warner Turbo cable modem.
 
I played with 4G at my local Sprint store and literally couldn't tell a difference... (However, down near Atlanta earlier this year I noticed 4G at 5 Mbps vs 3G at 1.5 Mbps; has it really dropped that much?!)

3G here is about 40-50 KB/s... 4G was 450 KB/s last year, but is now about 200 KB/s. The Sprint store probably has about 12 bars of reception...
 
Cell phones. Cell phones suck these days. Digital voice sucks.

I could understand people a whole lot better back in the analog days.

Now they have compressed the signals so much (and continue to increase compression so they can keep adding users without adding equipment) that a person's voice is barely audible.

56k MP3s sound better.
I noticed a clear difference when I switched from T-Mobile to AT&T. In addition to just sounding worse, they mute the audio when you're not talking to save bandwidth. This, of course, is not perfect, and anyone listening closely gets to hear a radio-like conversation. It's annoying as hell.
 
Meh, I thought the article was BS. Then again, I'm not a "power" user. I don't get anywhere near my bandwidth limits on either my phone or my home internet. The only mobile device I use is my phone so I don't have a lot of battery operated devices. And, I don't use any sort of cloud. I'll keep all my files to myself.
 
SSDs are not only differentiated by size but also artificially slowed down for the smaller sizes. What happened to tech companies making the best product possible? Perhaps they should spend more time on validation, since they all seem to have issues, and less time on artificial performance caps.

maybe you were born yesterday
 
3G here is about 40-50 KB/s... 4G was 450 KB/s last year, but is now about 200 KB/s. The Sprint store probably has about 12 bars of reception...

Geeze...

No, actually, I get 3 bars of 3G myself at ~1.0 Mbps usually (when I had my EVO active I got 1.5 Mbps); the Sprint store had a radical flux between 0 and 4 bars of 4G on the EVO tablet I was playing with. (According to the rep, they've been having problems lately with the 4G towers not working right after storms...)
 
Sorry, but disqualifying you from employment due to abject stupidity in posting something ridiculously stupid - IN PUBLIC - is not "discrimination".

Employers using social media to do pseudo background checks for information that should never be available to them, and that verges on privacy invasion and discrimination, is a huge step backwards. There are plenty of reasons why I completely refuse to use crap like Facebook and Twitter, but this is a major one.
 
Nonsense. Stupid people do stupid things. If you as an employer don't think your employee - who is dumb enough to both embarrass themselves in public AND post about it on Facebook - is dumb enough to either screw up unintentionally or screw over the company for their personal benefit, you simply don't know enough people.

While I agree that people are stupid for posting their info out there, the fact remains that employers shouldn't be able to use it against them.

Private life != type of employee.
 
No, you're correct. He has no idea what the hell he's talking about. Every SSD controller has a fixed number of lanes. The more storage you have, the more lanes are occupied. There's nothing artificial about it - it's simple math.

Not to mention that even if what he stated were true, actual HDDs have had crippled capacity to serve that market. If it costs the same to make 80 GB and 160 GB HDDs, it makes sense to charge a little bit more for the 160 GB and profit. This is still done with CPUs and graphics cards.
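
To put rough numbers on the "simple math" claim, here's a tiny sketch; the channel count, die size, and per-die speed are made-up assumptions for illustration, not any real controller's spec. The point is that a controller can only keep as many lanes busy as it has NAND dies to put on them, so smaller capacities top out lower without anything being deliberately switched off.

[code]
# Illustrative arithmetic only; all figures below are assumptions, not specs.
CHANNELS = 8      # parallel lanes on the controller
DIE_GB = 32       # capacity of a single NAND die
DIE_MBPS = 60     # rough sequential throughput of one die

for capacity_gb in (64, 128, 256):
    dies = capacity_gb // DIE_GB
    active = min(dies, CHANNELS)  # can't keep more lanes busy than there are dies
    print("%d GB -> %d lanes busy, ~%d MB/s" % (capacity_gb, active, active * DIE_MBPS))
[/code]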
 
When a line's top HDD is a 4-platter 1 TB product, the 500 GB drive is only going to have two platters and fewer heads, and it will run cooler. Or the same number of platters, only using the outside edge (faster). Not sure that qualifies as "crippled". GPUs are generally about yield quality (functional blocks). CPUs... guilty as charged.
 