An Essay for Programming Students

All software has bugs.

I know, but the question is more related to the size, the number, and the nature of the bugs themselves.

Some bugs are like a gnat, and some bugs are like a constant swarm of termites.

I can deal with gnats here and there. It's the constant swarm of termites that makes the program unusable and drags the whole PC down with it.
 
In enterprise software, bugs are a fact of life. Bugs are unknown quantities until they are found. Some should have been predicted and covered in the test cases. Others are caused by users doing strange things, interactions with operating systems, permission errors, drive errors, bad data, etc. They are not really predictable until a lot of people pound the software in a production environment.
 
To be honest, after programming for a couple of years, I'm amazed software runs as well as it does... things as complicated as game engines run flawlessly "most" of the time. Millions of lines of code all somehow work together to create these games that just work. I run one, join a server, and am playing with people from all over the country (sometimes the world) within seconds, like they're in the same room. That is just awesome... and when you break it down, in the end it is just electricity being interpreted as ones and zeroes... which are interpreted as opcodes by the processor... etc.

Kind of, definitely, but I'm just glad I got into programming instead of IT. Fixing computers isn't exciting or challenging, especially when you're just using software or hardware that you didn't even make yourself. And this post from Mike was one of the reasons I felt encouraged to keep at it, because I felt like an idiot at first. Most of my classmates were finishing labs and assignments within 30 minutes while I sat there for a couple of hours, mainly because I never asked for help. I felt like it was worth it to waste an hour on a stupid mistake so that I'd never make it again. If someone else points it out for me, sure, I can get done faster, but I wouldn't learn as much from it...

Like just yesterday, working on a ray-tracing / KD-tree assignment, I wasted a few hours on something stupid before realizing I already had all the information I needed and didn't need to waste time transforming things between spaces (screen to world and world to screen). But hopefully now I won't make that mistake again.
 
I know, but the question is more related to the size, the number, and the nature of the bugs themselves.
What is the specific question?

Some bugs are like a gnat, and some bugs are like a constant swarm of termites.
I'd categorize your full drive bug as a gnat. If the drive is full, then there are lots of things the computer can't do until some space is released. It's a simple issue for the user to free up some space; both the cause and the remedy are in their hands.
 
Mike-

The specific question was rhetorical.

I do have a question in regard to programming. I am debating doing a web development concentration for my graduate degree. The degree has Java (JSP), C# (ASP.NET/ADO), PHP, and a few other scripting/programming languages as the base, along with mobile phone programming electives.

I have the option of taking a Java data structures course. If I end up choosing this route, would you recommend taking the DS class? I haven't decided yet; I will be taking a C#/ASP.NET course in the fall.

I aced both of my Java courses. I will PM you the program.
 
It sounds to me like you're focusing on the wrong things. Sure, knowing a bunch of languages and technologies is great, but if you haven't taken a data structures course, you're not really going to know what you're doing.

Languages and technologies often change. Core fundamentals and problem solving can be applied to many situations in many technologies. You should know data structures and algorithms very well. Operating systems design and computer architecture will also prove very useful.
 
It sounds to me like you're focusing on the wrong things. Sure, knowing a bunch of languages and technologies is great, but if you haven't taken a data structures course, you're not really going to know what you're doing.

Languages and technologies often change. Core fundamentals and problem solving can be applied to many situations in many technologies. You should know data structures and algorithms very well. Operating systems design and computer architecture will also prove very useful.

Yeah, I had Data Structures as an undergrad. I think it was my soph year.
 
I do have a question in regard to programming. I am debating doing a web development concentration for my graduate degree. The degree has Java (JSP), C# (ASP.NET/ADO), PHP, and a few other scripting/programming languages as the base, along with mobile phone programming electives.

I have the option of taking a Java data structures course. If I end up choosing this route, would you recommend taking the DS class? I haven't decided yet; I will be taking a C#/ASP.NET course in the fall.

I aced both of my Java courses. I will PM you the program.

I'm not really sure I can provide you a meaningful answer. Generally, I don't see the point of post-graduate degrees. You'll spend more time in school instead of entering the workforce and gaining experience.

You want to do the post-grad program, but you don't know which one to choose. That tells me that you don't know why you're doing the post-grad program in the first place. The programs you linked seem very diverse, so you won't be mastering anything in particular when you're done.

Further, I don't know what your specific goals are. What would you be able to do after finishing the post-grad program that you couldn't do without it?
 
I'm not really sure I can provide you a meaningful answer. Generally, I don't see the point of post-graduate degrees. You'll spend more time in school instead of entering the workforce and gaining experience.

You want to do the post-grad program, but you don't know which one to choose. That tells me that you don't know why you're doing the post-grad program in the first place. The programs you linked seem very diverse, so you won't be mastering anything in particular when you're done.

Further, I don't know what your specific goals are. What would you be able to do after finishing the post-grad program that you couldn't do without it?

Graduate degrees can be used for career changes, to transition from one career to another. I am down to two selections, and I was looking for a bit of insight into whether the class is worth taking if I decide to go that route. I am looking into web application programming, more along the lines of SaaS. Think in terms of Office365, not Microsoft Office.

I will likely know after the fall semester which of the two paths I plan on taking.
 
Graduate degrees can be used for career changes, to transition from one career to another.
What is your present career and educational background, and how long have you been at it?
 
What is your present career and educational background, and how long have you been at it?

I have a Bachelor's in Business; that degree is 100% paid off and done. I have five years working as a network tech at the same company (no room at all for growth).

I am mostly asking for some insight in case I choose to do web application development.
 
All software has bugs.

Not to be that guy (the mega-bumper), but I would just like to chime in and state that my personal opinion is that some development practices are more prone to introducing bugs than others. I understand what you mean by 'all software has bugs', but for some reason, with some developers (and I'm certainly not saying this is you), 'software has bugs' seems to have become a get-out-of-jail-free card for bad development. The standards for quality in the software industry (across many roles, even) have for a while been appallingly low. I don't want an excellent developer to commit hara-kiri because they've shipped a bug, but I do wish there wasn't so much complacency about the scale and quantity of bugs coming out of some development shops.

When I show up at a development shop and there are no code reviews, no unit tests, no debug asserts for pre-condition/post-condition/invariant checks, and misused exceptions/null values/logging; when there's no automated acceptance or regression testing and the only QA work is being done by hand by a bunch of low-skilled 'testers' who don't really care enough because they're not paid much; when, because the QA work is all done by hand, they're behind on deadlines, so some of the 'testing' gets flat-out skipped; and when problems found in the QA region are 'fixed' within the 'test' region and shipped right up to production without going back through all of the QA channels because that takes too long, etc. etc., I find it hard to excuse bugs. If the production support staff hate the feature developers, and the developers fear their on-call rotations, this is usually a pretty strong indicator that quality is not up to par, yet in these types of shops I've often been reminded by developers, QA staff and managers alike that 'software has bugs'.
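
For what it's worth, the pre-condition/post-condition/invariant checks I'm talking about don't have to be heavyweight. Here's a minimal sketch in Java (the Account class and the amounts are made up purely for illustration):

// Hypothetical example: guarding state changes with explicit checks.
public final class Account {
    private long balanceCents;

    public Account(long openingBalanceCents) {
        // Precondition: an account can't start in the red.
        if (openingBalanceCents < 0) {
            throw new IllegalArgumentException("opening balance must be non-negative");
        }
        this.balanceCents = openingBalanceCents;
    }

    public void withdraw(long amountCents) {
        // Preconditions: fail loudly at the call site instead of silently corrupting state.
        if (amountCents <= 0) {
            throw new IllegalArgumentException("amount must be positive");
        }
        if (amountCents > balanceCents) {
            throw new IllegalStateException("insufficient funds");
        }

        balanceCents -= amountCents;

        // Invariant (checked when the JVM runs with -ea): the balance never goes negative.
        assert balanceCents >= 0 : "balance invariant violated";
    }

    public long balanceCents() {
        return balanceCents;
    }
}

Checks like these cost a few lines each, and they turn "weird data showed up three modules downstream" into "the bad call blew up right where it happened".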

There are plenty of tools and techniques that can catch many bugs very early on. If you're programming defensively and testing rigorously, then the bugs that do get through are likely the inescapable handful that your program was destined to have. But if you're not doing very basic practices like code reviews, you're bound to fall into the 'more bugs than others' category.
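
And the "testing rigorously" part doesn't need to be elaborate either. A couple of unit tests for the hypothetical Account class above, sketched with JUnit 4 (assuming it's on the classpath; the test names and scenarios are made up):

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;

import org.junit.Test;

// Minimal regression tests for the hypothetical Account class above.
public class AccountTest {

    @Test
    public void withdrawReducesBalance() {
        Account account = new Account(10_000);
        account.withdraw(2_500);
        assertEquals(7_500, account.balanceCents());
    }

    @Test
    public void overdraftIsRejected() {
        Account account = new Account(1_000);
        try {
            account.withdraw(5_000);
            fail("expected an IllegalStateException for overdraft");
        } catch (IllegalStateException expected) {
            // A rejected withdrawal must leave the balance untouched.
            assertEquals(1_000, account.balanceCents());
        }
    }
}

Run as part of the build, tests like these are exactly the kind of cheap regression net that catches the gnats before they get anywhere near production.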

Sorry for the rant, but I just wanted to make sure to emphasize (since this is a learning-oriented thread, after all) that there are too many people on the 'my code is working until proven to be broken' team and not enough people on the 'my code is broken until proven (to at least a reasonable extent) to be working' team. I would just like to see more beginning programmers start being proactive about quality before they're bitten by something that was relatively avoidable.
 
All software does have bugs. The art is deciding which are severe enough to delay (or stop) shipping. I guess I understand where you're coming from in wanting to indoctrinate developers earlier to be more aggressive about fixing problems, but I think the real solution is what you allude to in your anecdotal paragraph: tools and mechanisms that make bug detection and repair as easy as possible are the things that really drive quality.

OTOH, I'm weary of people who rant about every bug. They did such and so with some product and it didn't work, so straight to social media to post how whatever company is a bunch of idiots and jerks, and so on. I count posts like the one in this thread that claimed certain software was "a fine example of crap programming" in that pot; it's not "crap programming" to have a few bugs. The most likely explanation is that the programming team might have been razor sharp but pushed by business needs to ship too soon. Or, in the evidence we have so far, they're not crap at all -- they've got a single bug in the whole system, and that's not so bad.
 
OTOH, I'm weary of people who rant about every bug. They did such and so with some product and it didn't work, so straight to social media to post how whatever company is a bunch of idiots and jerks, and so on. I count posts like the one in this thread that claimed certain software was "a fine example of crap programming" in that pot; it's not "crap programming" to have a few bugs. The most likely explanation is that the programming team might have been razor sharp but pushed by business needs to ship too soon. Or, in the evidence we have so far, they're not crap at all -- they've got a single bug in the whole system, and that's not so bad.

I honestly feel that the kind of people you're describing have made my job more difficult. At the company I am at right now, I've been working to integrate quality-enabling tools and practices into our software lifecycle. Historically, previous attempts at this have made enemies within the organization, because the people who were trying to spearhead said attempts adopted the approach of vocally and aggressively informing people that the fruits of their labour were garbage. Even the very successful teams, who did partially adopt better practices and were honestly producing some very good applications, were under fire from these groups, and I really don't think attacking such high-visibility, highly respected projects within the company helped the cause. This time around we're taking the approach of emphasizing what better tools can do to augment our lifecycle, rather than bluntly accusing everybody from developers to VPs of being recklessly incompetent. So far things are going much more smoothly, but I still feel there's a lot of resistance to change that stems fairly heavily from 'burnt bridges'.

The old development practices had some serious problems that really needed to be addressed, but because of the tactless way these issues were originally approached (flippant remarks in meetings, excessively negative exit interviews, very public feedback through social media and sites like Glassdoor, etc.), I think there's more push-back than there should be. I really wish I could say those people found some grand moral of the story or brilliant take-away from those past exercises, but honestly all I could come up with is: 'People get mad when you call them stupid'. And that doesn't really seem all that clever, to me.
 
Indeed, at the end of the day, this is really a management problem.

Over-stating the importance of particular bugs (or usability issues, or ...) ends up causing trouble for the project as a whole because it interferes with effectively triaging bugs. Bugs happen; successful projects find a good way to manage them. There are lots of good ways to manage bugs (and manage projects in general), so we can be flexible in our approach. But if the measurements we use to make fundamental decisions (how severe is a given bug, and how do I compare it to all the other bug fixes and feature work that we have to do?) are distorted, then we're probably not going to be successful.

Personalities that drive are usually healthy as long as they remain balanced. Personalities that entrench or vilify end up being extreme cases that cause more harm than good in the long-term. (I think this is accelerated by the tendency of people to seek drama rather than productivity, but that assertion is on pretty thin ice.)

I think harmful personalities are what you're really describing, but that just comes back to management: get those folks to temper their delivery, frequency, or amplitude, and you're making a big dent in the problem quickly. Building a team that realizes it isn't personal also goes a long way toward ensuring that feedback -- even pretty negative news -- is received in the spirit it was intended.
 