Have Software Developers Given Up?

While I'm sure most companies don't intentionally treat their customers like beta testers, there are others *cough*Adobe*cough* that don't seem to really care that their products are riddled with security flaws. Hell, maybe software developers have just given up.

Over the last few years it feels like the quality of software and services across the industry is falling rather than climbing. Everything is always beta (both in name and quality). Things are shipped when marketing wants them to rather than when they’re ready because “we can easily patch them”. End users have basically become testers, but it’s ok, because this is Agile. We’ve started coding to expect failure and somehow with it decided that failure is normal and expected and we don’t need to put so much effort into avoiding it. Supporting millions of customers is complicated so we don’t bother. Why waste time reading bug reports from users when you can just send them into an endless maze of help links with no contact information?
 
I don't know about giving up; the examples presented seem bad and quite stupid, but not being a programmer, I can imagine code is becoming increasingly complex.
 
I don't know about most software developers, but my management constantly pushes me to release features regardless of whether they're finished or tested, based on a timeline they set before they even talk to me. Is it any wonder the product is buggy as hell? No, definitely not.
 
This sounds like more of an obnoxious rant than anything.
 
I am not a coder, but at the past few companies I've worked for, I am often shocked at what gets pushed out now, versus a company I worked at for six years between 2006 and 2012, where any updates and changes were done over a weekend, announced in advance, and tested through several iterations before they even got to that point.

Night and day.
 
I think the programmers have given up trying to be reasonable with the bureaucracy that controls the money letting them create or support their products. I see it all the time in my father as he goes on epic rants about the system he is responsible for.
 
Errare humanum est. How often do [H] articles have errors - from simple typos to bad grammar to mis-copied tables to more - in them? And I'm sure all those articles were checked. It's how we recover from errors that counts. (And here [H] is excellent.)
 
Errare humanum est. How often do [H] articles have errors - from simple typos to bad grammar to mis-copied tables to more - in them? And I'm sure all those articles were checked. It's how we recover from errors that counts. (And here [H] is excellent.)

Very true. But overall it seems like the standard of quality is slipping. Five years ago, two grammatical errors would have been acceptable (even if they took two hours to fix); if there are now six errors and it's gradually getting worse year over year, that doesn't look good. Keep quality high. Mistakes happen. Just don't see those mistakes and say "ship it anyway, we'll work on a fix..." No, you fix the mistake you noticed and then ship it.
 
As a software engineer, I can really say that bureaucracy has its way of screwing with your head. You're normally being micromanaged by people who have no idea how the programming works, or how a decent data flow diagram of what you're trying to program actually translates into code.

In game programming, it's a snowball of rising expectations. People are used to a certain benchmark, so you have to deliver that or people won't even bite. The increasing demand for good graphics creates a mountain of money you have to climb just to push a game out now.

It's a 2016 thing: people all feel like they're 'involved' in the system now, even the people who can't write one line of code. So the minute something isn't how they wanted it in their head, they scream "SHIT". It could even be something as simple as moving a button 4 pixels to the right.

Especially when work is involved, it's easier to blame other people than it is to just work with the system that you have and enjoy that it's possible.
 
Every time Ubisoft releases a buggy half-finished game the forums are filled with people saying they will never buy another Ubi game. Then as soon as the next game is announced those same people can't wait to pre-order it. When your customers don't really care about the quality of your product, why should you?
 
I think most of this is simply a capacity problem: more lines of code expected at faster and faster cadences, with the resources and tools not always keeping up.
 
There are at least two problems here culminating in the one big problem of errors everywhere.

1) The web is forgiving. This goes for the technology as well as the users. So many of the issues showcased were web-based errors. From the start of HTML, we've seen how forgiving the web can be, and with the numerous technologies bolted onto a website to make it function, that same forgiveness got built into those technologies as well (see the sketch at the end of this post for how much broken markup a parser will happily accept). Because HTML has always been so forgiving, lack of QA is something websites have always been plagued with. As a result, it's something we as users have simply tolerated over the years, as those problems continue to compound and grow into issues that truly do break a site.

2) Too many cooks with too many pots. Developers don't work on a single project anymore, and projects aren't developed by single developers anymore. With a developer involved in so many projects, it's hard to ensure that quality is controlled. Mix that with so many developers adding to each project, and now no one can ensure that everything being added fits the quality standards needed/desired for the project. There's a very good reason to have multiple people on a project: if your sole developer was hit by a bus, you've not only lost a great developer but possibly the entire project! But there is something to be said for ensuring quality when code is added or changed.

There are likely a few more points here - such as "many developers ARE lazy," and "many products ARE driven by marketing teams that have no idea how long it really takes to make X-Product/Patch," etc - but these two seem to be the key points I observe.
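
To make point 1 concrete, here's a minimal sketch using only the Python standard library. The markup is made up by me; the point is that the parser never complains, no matter how much is missing or unclosed.

```python
# A made-up snippet of broken HTML: unclosed <li> and <p> tags, an unquoted
# attribute value, and no closing tags anywhere.
from html.parser import HTMLParser

broken_markup = "<ul><li>first item<li>second item<p class=note>nothing here is closed"

class TagLogger(HTMLParser):
    def handle_starttag(self, tag, attrs):
        print("start tag:", tag, attrs)

    def handle_data(self, data):
        if data.strip():
            print("text:", data.strip())

# The parser never raises; every error in the markup is silently absorbed
# and you still get a usable stream of tags and text.
TagLogger().feed(broken_markup)
```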
 
Used to be you actually saw much of the code in a project. Now most IDE platforms hide much of it: add a control, type a bit of code, click, and the code is out of sight. Not to mention the nasty habit of linking in unverified libraries from third-party sites. My guess is most coders do the bits of a screen they are tasked with and have little or no idea who coded what or how the rest of the project was put together. Plus there is the problem that age and experience are often considered a liability rather than an asset.
 
Old people leave behind questionable architecture based on old technology. New kids come in with less knowledge of the old tech, and even less of the architecture, and patch it up. Legacy code from the 1960s can still be in production in some cases. (Case in point: banks and COBOL.)

Hazard of the industry, and why you should:

1. Have at least a 50% coverage rule: at least two people know each section of code, and if possible, three. If you have three people, you divide the code into three sections. Person 1 knows A and B, Person 2 knows B and C, and Person 3 knows C and A (a tiny sketch of this rotation follows the list). That way, if Person 2 leaves, Person 1 is the only one left who knows B and Person 3 the only one who knows C, so each of them can say, "You have to pay me double or you're doubly in trouble."

2. If they are the architect, pay them handsomely. Retain what talent you have.
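
A tiny sketch of that rotation, just to make the overlap explicit; the names and section labels are placeholders, not anything from a real team.

```python
# Assign each person two adjacent sections so every section is known by at
# least two people. Purely illustrative.
def coverage_plan(people, sections):
    assert len(people) == len(sections)
    plan = {}
    for i, person in enumerate(people):
        # each person owns their own section plus the next one, wrapping around
        plan[person] = [sections[i], sections[(i + 1) % len(sections)]]
    return plan

print(coverage_plan(["Person 1", "Person 2", "Person 3"], ["A", "B", "C"]))
# -> {'Person 1': ['A', 'B'], 'Person 2': ['B', 'C'], 'Person 3': ['C', 'A']}
```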
 
In my experience it's that there isn't enough time given to development; it's not necessarily that the bugs are noticed and they say ship it anyway. What I have seen is that such-and-such feature is supposed to be done and tested by this date, and was originally planned with so many days or weeks of testing. Development gets it done by the date, but complications came up and cut into testing time, so instead of weeks of testing there were just days. That's not enough time to even find the bugs, let alone fix them.
 
Hurry up, we have to update at the end of this sprint!
 
I've worked with supposedly good companies, and in the sectors of government, finance, and private industry. I've heard complaints in every company about how the quality is terrible, and would never be accepted elsewhere. Guess what? It probably is.

I wouldn't say people have given up. Rather, priorities have changed. The best code I've seen has been when I've been working with embedded devices, where memory and speed come at a premium and you can't fix things after the fact. That was probably true 30+ years ago with mainstream computers too. But marketing does control things, which is partially why I despise them. When resources are limited, the developers have power over the marketing team and can dictate terms. When they're not, well, that's when you get unreasonable deadlines and subpar code.

It's also why you won't see nearly as many superstar hackers anymore. I can hire a new college grad for 1/10th of the price who can do 1/3 of the work. And I can get 4-5 of them who can do 90% of the work. So what if it's buggy and slow... Tomorrow we'll have faster machines, and if something crashes, well, it's just faster to document that and tell the user not to do that than to take time to actually fix it.
 
This is the problem:

I implemented automated testing, continuous integration, agile development, regression testing, automatic bug reports, and user experience metrics. Any time or effort I saved is funnelled into new products or features. The same thing is true if I work an insane amount of unpaid overtime. I am always running in place. Unless I start lying, I don't get the time to solidify the products we have.
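
For anyone who hasn't dealt with it, "regression testing" in that list boils down to something like the sketch below: a hypothetical parse_price() helper and a pytest case that pins down an old whitespace bug so it can't quietly come back. None of this is from any specific product.

```python
# Hypothetical example of a regression test: parse_price() once crashed on
# input with stray whitespace, so the fixed behaviour is locked in here.
import pytest

def parse_price(text: str) -> float:
    """Hypothetical production helper under test."""
    return float(text.strip().lstrip("$"))

@pytest.mark.parametrize("raw, expected", [
    ("$19.99", 19.99),
    ("  $5.00 ", 5.00),   # the old bug: leading/trailing whitespace
    ("0.99", 0.99),
])
def test_parse_price_regression(raw, expected):
    assert parse_price(raw) == pytest.approx(expected)
```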
 
This is the problem:

I implemented automated testing, continuous integration, agile development, regression testing, automatic bug reports, and user experience metrics. Any time or effort I saved is funnelled into new products or features. The same thing is true if I work an insane amount of unpaid overtime. I am always running in place. Unless I start lying, I don't get the time to solidify the products we have.

You're doing it wrong. You need to be like Scotty: if it takes one week to do what they're asking, tell them it takes two, so you have time to test and optimize properly.
 
People buy regardless. Most seem to have accepted "broken and barely working until a few patches down the road" as the norm. There is no ROI on producing a less buggy product; at least, it appears that way. Why should the dev care when the publisher doesn't, since customers keep giving them money despite the product being less than ready? Sometimes way less than ready.
 
We are in a very 'need it now' culture. Companies just don't have the time or resources to test properly, because customers can't wait the extra six months for things to be done right and nobody wants to pay for more testers.
 
The big shops like Adobe and Microsoft aren't exactly going to have the best software developers, for the same reason McDonald's doesn't have the best cooks.
 
Every time Ubisoft releases a buggy half-finished game the forums are filled with people saying they will never buy another Ubi game. Then as soon as the next game is announced those same people can't wait to pre-order it. When your customers don't really care about the quality of your product, why should you?
I haven't bought a Ubi game in a good while... though I am interested in that For Honor game.

That said, Ubi has many different developers, so it's not really the same as a single group of people failing.
 
If the offices of Adobe and Oracle burned down overnight with no loss of human life... I'd probably smile.
 
Software developers across the board have gotten lazy, because hardly any of them can make use of all these cores the Intel PCs have been pushing out. A 4-core is no different from a 6+ core most of the time unless you use a handful of products.
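
For what it's worth, spreading CPU-bound work across cores isn't exotic. Here's a rough standard-library sketch; the prime-counting workload is just a made-up stand-in, not anything from a real product.

```python
# Spread a CPU-bound job across all available cores with a process pool.
# count_primes() is a deliberately naive stand-in workload.
from concurrent.futures import ProcessPoolExecutor
import os

def count_primes(limit: int) -> int:
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # One chunk of work per core; the pool defaults to one worker per core.
    chunks = [50_000] * (os.cpu_count() or 4)
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)
```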
 
Hurry up, we have to update at the end of this sprint!
LOL!!!

Wait.. I'm confused... Does that make this thread an epic or a story?

But yeah, long story short, I blame Agile... Software and features grow with each release, which is set by a sprint schedule, while QA time remains constant. So within each sprint, QA needs to test more and more but isn't given any additional time. That said, sales and finance love Agile.
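
The toy version of that math (numbers entirely made up): if each sprint adds features to the pile but QA hours per sprint stay fixed, the hours available per feature only go down.

```python
# Made-up numbers: a fixed QA budget per sprint, but the set of features that
# needs re-testing keeps growing, so hours per feature shrink every sprint.
qa_hours_per_sprint = 80
features_added_per_sprint = 5

total_features = 0
for sprint in range(1, 7):
    total_features += features_added_per_sprint
    hours_per_feature = qa_hours_per_sprint / total_features
    print(f"sprint {sprint}: {total_features} features, {hours_per_feature:.1f} QA hours each")
```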
 
Agile done correctly is all about getting the appropriate time and setting realistic goals. My wife is an Agile coach, and she has nightmares about all of the would-be 'Agile' teams just doing it wrong. Many times my wife has outright refused to agree to the restrictive timeframes demanded of her teams, because she wants to avoid what you describe.

To the main topic: software devs have not gotten lazy; the scope of software has gotten HUGE. Developers put JUST as much effort into their work as they did 'back in the good old days,' but you simply don't have weeks to cut down a kilobyte of RAM usage, let alone 500 MB. Vulnerabilities are ridiculous nowadays: the average team puts WAY more effort into security than teams did 'back in the good old days,' because the scope of a 'good old days' application was 1% of what applications can do now.
 
I think the programmers have given up trying to be reasonable with the bureaucracy that controls the money letting them create or support their products. I see it all the time in my father as he goes on epic rants about the system he is responsible for.

I agree. I'm not a programmer where I work, but it's a large company and I see the same type of bureaucratic pressure put on workers to optimize profits. Everybody is always expected to do more than they can, but at the same time nobody ever blames you for not getting everything done, because they know you really can't, even though they ask you to. It's a revolving door too: people get burnt out and resentful for being asked to work extra and come in on days off. Not a lot of people have worked there for more than a year.

Instead of just hiring more workers so that all the work could get done with a comfortable margin, they slice the employee numbers down to the bare minimum and then make them work extra.

Apparently this is the most profitable system, at least in the short term. I don't think this type of environment builds strong long term success. But it seems like a lot of big businesses these days are after high quarterly reports as a primary goal.
 
Software development seems to encompass a lot of jobs where fresh meat comes in just as fast as it is ground up and spit out. I think this is what has allowed it to get to the point we are now at. Any slowdown in the fresh meat is handled with indentured servants; I mean, H-1B employees.
 
The manager's mantra:

[image: Thomas Carlyle quote, "No pressure, no diamonds"]

The manager's idol:

[image]

The manager's logic:

[image: "stupid logic" meme]
 
Oh yeah, let me add a weird bug that hasn't been addressed... I can't see the Google News page in Chrome on my Android 5 based tablet. Weird, and it hasn't been fixed for a long time.
 
Don't you just hate it when all the coffee creamer is gone by the coffee machine....

/Andy Rooney
 
As a dev, I don't know how to 'give up'... but I do know how to give 'minimal effort' if/when that backlog starts really piling up thanks to scope creep and moving targets introduced by clients who don't understand that you don't build the entire damn thing in a single sprint.
 