US Government Is Spending Billions On Old Tech That Barely Works

Probably no one RTA, but this is the crazy part:
$61 billion was spent on operations and maintenance, while [$19B] went to development and enhancement, such as purchasing new systems or expanding existing ones.
Over 3/4 of the budget spent on maintenance? There has to be some massive waste going on.

A Paris airport was forced to shut down earlier this month after a computer running Windows 3.1, a prehistoric operating system from 23 years ago, crashed in bad weather.
Stability comes into play at some point.
 
It's not that I mind using old hardware and software when it's working fine, doing the job fine, and there are no gains from upgrading.

What I do mind is when anyone is running something in a critical application where replacement parts or systems don't exist, or would require several weeks and some shady eBay purchases at hostage rates to find a working part.

I've done a few jobs over the years where I have looked a customer in the eye and said, "I know what you are doing with this system, AND how long you intend to keep it (forever). You need to just shut up and buy another motherboard/RAID card/whatever right now and put it in a locked cabinet because in 5-10 years these parts won't exist and if it dies you're screwed."
 
Old tech in modern places is everywhere. I've seen hospitals that still use Windows XP with a network infrastructure that's still 10 Mbps.
 
Actually, that's quite untrue; there are many mission critical systems that use Windows, and for very good reasons. There is far more direct support for Windows applications than there is for other OSes, and there are far more ports of certain applications on Windows than on other systems. Also, the cost to redesign a program for a different OS usually exceeds what is allotted in the budget, so they remain on Windows.

Having something mission critical on Windows does not make it smart. What you're saying is that they're cutting corners to save cost and make their lives easier, jeopardizing the whole project as a result. Windows has so many inherent security problems and is so heavily targeted by every imaginable criminal on this planet that it will never be a good option for mission critical operations.

For government work, nothing beats an OS and modules you can code internally. The cost will be way less than what they're paying Microsoft now lol.
 
It's not just that.

There are many systems in the world that talk to other systems that are not so easy to update, like 20 year old satellites. Not only do they have to make sure they don't mess with their ability to manage the distant systems, they also cannot afford any downtime, none. I interviewed with such a company a few years back, and not only are they way back there when it comes to running older systems, they also spend a great deal of time before they even apply a patch, running it against simulators and such before it ever gets pushed live. Some things you just don't mess around with, no cowboy stuff.

For their mission critical stuff, the government is light years ahead of places like Google in terms of stability. No use of pointers, so everything goes on the stack. And despite the overall average being lower, they actually do employ some pretty intelligent people for those small areas. The issue is, not a lot of things are considered mission critical, and we laugh when other, not so important stuff seems to be held to the hardware standards but not the software coding standards.
 
Having something mission critical on Windows does not make it smart. What your'e saying is that they're cutting corners to save cost and make their lives easyer and jeopardize the whole project as a result. Windows has so many inherent security problems and it's so heavily targeted by every imaginable criminal on this planet that it will never be a good option for mission critical operations.

For government work, nothing beats an OS and modules you can code internally. The cost will be way less than what they're paying now to Microsoft lol.

The ability to audit the code yourself, and a very small footprint when you need it (like for embedded stuff), is great for security. I too would, for some reason, feel more confident running a webpage on LAMP than on Windows Server.
But this gives some admins a false sense of security. "It doesn't get viruses" does not equal secure. I see plenty of people not taking the proper time to RTFM who wind up with hackjob servers: service daemons running as root, or web directories with 777 permissions because "something needed write permissions and we didn't bother checking what and why".
Even on Windows, when I had to install Apache on it, I made sure the service ran as a limited user with explicit ACL denial entries for Important Stuff.
This is often glossed over, and then we get open relays and such.
Some mission critical stuff actually needs Windows for optimal results. You could run a domain using Samba with Windows clients, but I still prefer Windows because there's no risk that an update will break it or that someone will request an obscure feature Samba doesn't support.
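Since we're on the 777-permissions hackjobs: here's a quick throwaway sketch (Python, with a scratch directory standing in for a web root; purely illustrative, not any real server's layout) of the kind of audit pass that catches them, instead of just handing out write access to everyone:

```python
import os
import stat
import tempfile

def world_writable(top):
    """Return every path under `top` that anyone on the box may write to."""
    hits = []
    for dirpath, dirnames, filenames in os.walk(top):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_mode & stat.S_IWOTH:  # the 'o+w' bit
                hits.append(path)
    return hits

# Scratch directory standing in for a web root.
webroot = tempfile.mkdtemp()
index = os.path.join(webroot, "index.html")
open(index, "w").close()

os.chmod(index, 0o777)  # the "something needed write permissions" hackjob
assert world_writable(webroot) == [index]

os.chmod(index, 0o640)  # owner rw, group r, everyone else nothing
assert world_writable(webroot) == []
```

Same idea as the ACL denials on Windows: figure out exactly what needs write access and deny the rest, rather than opening everything up.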
 
Having something mission critical on Windows does not make it smart. What your'e saying is that they're cutting corners to save cost and make their lives easyer and jeopardize the whole project as a result. Windows has so many inherent security problems and it's so heavily targeted by every imaginable criminal on this planet that it will never be a good option for mission critical operations.

For government work, nothing beats an OS and modules you can code internally. The cost will be way less than what they're paying now to Microsoft lol.

B00nie, I am sorry but you really have no idea what you are talking about here. There are a lot more security suites for these setups on Windows than there are for Linux. Also, there are tons of people in the government who write code for Windows as well as applications for it, so I am not really sure what you are getting at there. No matter what OS you use, there are going to be vulnerabilities; in fact there have been some very serious vulnerabilities in packages that ship with Linux software in the past 2 years. Those vulnerabilities have affected a great number of devices including servers, special project builds, networking equipment, analysis equipment, etc. So Windows is not in fact the least secure option. Those of us who work in the mission critical areas of the government are well aware of the issues we face every day with our applications, but just saying you are going to use a different platform does not inherently make the project any safer.
 
B00nie, I am sorry but you really have no idea what you are talking about here. There are a lot more security suites for these setups on Windows than there are for Linux. Also, there are tons of people in the government who write code for Windows as well as applications for it, so I am not really sure what you are getting at there. No matter what OS you use, there are going to be vulnerabilities; in fact there have been some very serious vulnerabilities in packages that ship with Linux software in the past 2 years. Those vulnerabilities have affected a great number of devices including servers, special project builds, networking equipment, analysis equipment, etc. So Windows is not in fact the least secure option. Those of us who work in the mission critical areas of the government are well aware of the issues we face every day with our applications, but just saying you are going to use a different platform does not inherently make the project any safer.
Security suites? With closed source you can never be sure of the security of your system, especially using a desktop OS as the base, with the horrible mess of Windows updates lately.
With open source you can audit literally every bit of your code until it's guaranteed to be secure. You can never do that with Windows.
 
Security suites? With closed source you can never be sure of the security of your system, especially using a desktop OS as the base, with the horrible mess of Windows updates lately.
With open source you can audit literally every bit of your code until it's guaranteed to be secure. You can never do that with Windows.

Again, you are making statements that are just completely untrue. Closed source does not mean you cannot be sure of the security of your system; there are all sorts of tools that can be used to determine that. Also, I don't think you really understand how enterprises do their updates and system control. The Windows updates are not really a mess at all; I am not even sure where you are getting that from.

Using open source, you cannot audit literally every bit of code. Just because you may have an open source system doesn't mean everything in that system is open source, nor does it mean you understand everything in that system. All systems are complex; there really is no difference between open source and closed source as far as determining the overall security. In fact, there are typically more problems in open source because attackers know what is in your system more easily and can directly attack it.
 
Again, you are making statements that are just completely untrue. Closed source does not mean you cannot be sure of the security of your system; there are all sorts of tools that can be used to determine that. Also, I don't think you really understand how enterprises do their updates and system control. The Windows updates are not really a mess at all; I am not even sure where you are getting that from.

Using open source, you cannot audit literally every bit of code. Just because you may have an open source system doesn't mean everything in that system is open source, nor does it mean you understand everything in that system. All systems are complex; there really is no difference between open source and closed source as far as determining the overall security. In fact, there are typically more problems in open source because attackers know what is in your system more easily and can directly attack it.
Whatever, Mr. Microsoft employee.
 
Why not, Tony Stark does it??

Well, not to be all TeamCap or anything, but Stark did (unwittingly) unleash Ultron. ;) Yeah, in real life, Stark would probably have some kind of "accident", due to the OGAs of any number of countries ...
 
Virtualize that ancient crap. One ESXi cluster could run the entire Defense Department.


I can't decide if you are totally clueless or just trolling. The DoD uses more virtualization than most would believe. The Army runs the single largest enterprise network in the world. It's so large that Microsoft told them it couldn't be done, and they went right ahead and did it anyway and made it work.

And at the most basic level, you're actually totally incorrect, if for no other reason than that security classification requirements mandate no fewer than four independent and unconnected networks, which means you would need at least four clusters, and there are probably hundreds of them. Just off the top of my head, I know there would be the same four at every military installation/base. So four clusters just became like 400 clusters. I won't bother to take this any further; my point should be made.
 
Aside from laziness, it's really a communication issue. The current system for upgrades in the government goes like this:

1) Find new hardware; it must come from government-approved vendors, and if not, fill out a waiver form. Get approval from higher ups: supervisor, manager.
2) Is it compliant? (There are various rules and regulations requiring hardware to be as accessible as possible.) If not, and it can't be, then fill out a form to waive this compliance. This too needs approval, usually from an administrator.
3) Get executive approval, which usually comes from another district/office; this process alone can take months. Finally get approval to place the order.
4) Find someone who knows how the hardware/software works; if no one does, hire a contractor (this can delay the process even longer, by up to a year).
5) At this point the hardware we originally requested is already out of date, so various software updates need to be performed for it to fall under executive office compliance regulations.
6) Beta test at smaller districts, usually for a period of 2-6 months depending on what we're changing.
7) Go through final distribution and installation, another 3-6 months depending on office size. By this point, whatever computer hardware you originally intended to upgrade to is already 1-2 years old.

It literally takes me 2 months to receive ink cartridges when I place orders at my office. Because of this, most employees in the government have an "If it ain't broke, don't fix it" mentality.

The government is actually doing better lately. We're upgrading to Windows 10 this year, whereas we had only just upgraded to Windows 7 in 2012. I work in one of the larger districts, so we tend to get upgrades as soon as they're finalized and approved.
 
Aside from laziness, it's really a communication issue. The current system for upgrades in the government goes like this:

1) Find new hardware; it must come from government-approved vendors, and if not, fill out a waiver form. Get approval from higher ups: supervisor, manager.
2) Is it compliant? (There are various rules and regulations requiring hardware to be as accessible as possible.) If not, and it can't be, then fill out a form to waive this compliance. This too needs approval, usually from an administrator.
3) Get executive approval, which usually comes from another district/office; this process alone can take months. Finally get approval to place the order.
4) Find someone who knows how the hardware/software works; if no one does, hire a contractor (this can delay the process even longer, by up to a year).
5) At this point the hardware we originally requested is already out of date, so various software updates need to be performed for it to fall under executive office compliance regulations.
6) Go through final distribution and installation, another 3-6 months depending on office size.

It literally takes me 2 months to receive ink cartridges when I place orders at my office.

You sound like my brother-in-law in the Navy, who says it can take almost half that amount of time to get a new power tool to replace a broken one.
 
Security suites? With closed source you can never be sure of the security of your system, especially using a desktop OS as the base, with the horrible mess of Windows updates lately.
With open source you can audit literally every bit of your code until it's guaranteed to be secure. You can never do that with Windows.

B00nie, do you know that the DoD, and most of the Federal Government for that matter, don't take MS updates straight from Microsoft?

For example, the DoD uses its own customized OS builds; they were called Army Gold Master (AGM), and I think they are now called UGM (Unified Gold Master, or something like that). The Army looks over MS patches and decides which updates it will serve, which it will not serve, and which it will find other ways to mitigate. Now, I am not saying they have never pushed out a bad update, but it's not a mess like you claim. They just don't take Microsoft's products at their word and trust that they are working properly. They have teams that test those patches against the current deployment builds, and test the deployment of the updates as well.

It just sounds like you are unaware of how things really are with the DoD, and much of the Government's IT in general. That being said, there are parts of the government which are not up to these same standards.
 
Whatever, Mr. Microsoft employee.

Actually, at work I primarily use Debian, RedHat, and other vendor-supported open source solutions for mission critical operations in the government. It is just clear you have no idea how these things are managed or run in an enterprise. You keep making statements that are simply not true and have no basis in fact.

B00nie, do you know that the DoD, and most of the Federal Government for that matter, don't take MS updates straight from Microsoft?

For example, the DoD uses its own customized OS builds; they were called Army Gold Master (AGM), and I think they are now called UGM (Unified Gold Master, or something like that). The Army looks over MS patches and decides which updates it will serve, which it will not serve, and which it will find other ways to mitigate. Now, I am not saying they have never pushed out a bad update, but it's not a mess like you claim. They just don't take Microsoft's products at their word and trust that they are working properly. They have teams that test those patches against the current deployment builds, and test the deployment of the updates as well.

It just sounds like you are unaware of how things really are with the DoD, and much of the Government's IT in general. That being said, there are parts of the government which are not up to these same standards.

That is a pretty good depiction. To sum up what really happens: the government gets the updates, tests them on its core builds in VMs for a while to make sure they do not conflict with all the software packages installed, and only after they are vetted does it push them out. This is the same practice that enterprise companies use. The AGM is mainly for the Army; the images change from department to department. For instance, DISA uses a different image than the Army, which uses a different image than the NSA, and so on and so forth. Even within a department like the Army, certain organizations will have their own image that is a bit different from the AGM. The AGM is just a base image, and often many things are installed on top of it to increase security. There are many COTS, GOTS, and program-proprietary installs that are used.
 
You sound like my brother-in-law in the Navy, who says it can take almost half that amount of time to get a new power tool to replace a broken one.

Yep, the government is really tight on things depending on the need, but to quote the G-Man: "The right man in the right place can make all the difference." We had a guy who did our procurement, and he was able to get orders passed through and delivered within weeks of me requesting any supplies, which is incredibly fast by government standards. He has since retired, and whoever is currently in charge sits on paperwork like a phone bill. We actually had the internet to our office of 400 employees disconnected because someone forgot to pay the bill for the month...

To contrast, in the private sector if you need something, you order that shit expedited and it's at your office within 2-3 days, because people are paid to do their job, because time is money to them.
 
Yep, the government is really tight on things depending on the need, but to quote the G-Man: "The right man in the right place can make all the difference." We had a guy who did our procurement, and he was able to get orders passed through and delivered within weeks of me requesting any supplies, which is incredibly fast by government standards. He has since retired, and whoever is currently in charge sits on paperwork like a phone bill. We actually had the internet to our office of 400 employees disconnected because someone forgot to pay the bill for the month...

To contrast, in the private sector if you need something, you order that shit expedited and it's at your office within 2-3 days, because people are paid to do their job, because time is money to them.

While many procurements can take a long time, there are definitely ways to expedite purchases, especially if the purchase is for something small. Replacing a smaller item like a power tool is usually super easy. The total cost of the item you need to procure will determine who has to make the purchase for you. Also, some contracts leave the purchasing completely up to the contract company, and sometimes that eases the procurement process immensely. So there really are a number of different situations that can help.
 
While many procurements can take a long time, there are definitely ways to expedite purchases, especially if the purchase is for something small. Replacing a smaller item like a power tool is usually super easy. The total cost of the item you need to procure will determine who has to make the purchase for you. Also, some contracts leave the purchasing completely up to the contract company, and sometimes that eases the procurement process immensely. So there really are a number of different situations that can help.

Small stuff, yes; our supply room has no issues getting pens, paper, and more, but in my section I do litigation graphics and tech. The stuff I order, like ink, foam boards, DesignJet paper rolls, etc., is usually in the $500-$3K range, so it takes more approval than most. I usually get the approval quickly; it's the people in procurement taking their sweet time passing the paperwork through and calling the vendors. They did away with contractors in that section (which I think worked faster) and gave them all government positions. Also, we use GSA-approved vendors as often as possible. They are much slower than places like Staples or Newegg.
 
Small stuff, yes; our supply room has no issues getting pens, paper, and more, but in my section I do litigation graphics and tech. The stuff I order, like ink, foam boards, DesignJet paper rolls, etc., is usually in the $500-$3K range, so it takes more approval than most. I usually get the approval quickly; it's the people in procurement taking their sweet time passing the paperwork through and calling the vendors. They did away with contractors in that section (which I think worked faster) and gave them all government positions. Also, we use GSA-approved vendors as often as possible. They are much slower than places like Staples or Newegg.

Like I said, it is relative. I was on a project where ordering a $25K item was just something we put on a government credit card, and it didn't take any time at all. Every government representative has a limit they can go to on their card and conditions on what they can use it for. If you set up those limits and conditions properly at the beginning of the program, then ordering the things you need for the program usually isn't much of an issue. Now, getting some of those items into the lab, that could be an issue, heh.
 
No doubt. My customer pays for excellent service from NetApp, and if I lose a hard drive, NetApp sends a replacement that arrives next day air. Then the drive sits at another building on someone's desk for two to three days until they get it to my building, where it again sits waiting for Security to scan it, even though they know they can't actually scan a SAS drive (they don't have the equipment and haven't ordered it). Then it goes to CM, where the drive is catalogued, labeled, and transferred to the media library, where now, after 5 days, I can finally sign out that drive and replace it, hopefully before another drive fails and risks losing the RAID group.

It can be maddening and all I can do is make sure I have documented my actions properly so my company and I stay out of hot water if it all goes south.
 
No matter what OS you use, there are going to be vulnerabilities; in fact there have been some very serious vulnerabilities in packages that ship with Linux software in the past 2 years. Those vulnerabilities have affected a great number of devices including servers, special project builds, networking equipment, analysis equipment, etc. So Windows is not in fact the least secure option. Those of us who work in the mission critical areas of the government are well aware of the issues we face every day with our applications, but just saying you are going to use a different platform does not inherently make the project any safer.

Yes and no. I have also worked on government systems, and most of them were Unix-based running Oracle or IBM stuff. On many occasions, efforts were made to move I&C to Windows platforms, but the initial out-of-the-box experience and the difference in security and stability used to be quite evident. I don't know how it is now, because I don't do I&C anymore, but back then Windows was not in any way, shape, or form an equal to Unix, or Linux for that matter. Clustering services for Windows 2003 in particular used to be a joke; if you've never seen the Windows Clustering Service fail, just pray that you never do. One of the biggest reasons for cost overruns was stability and the testing required before production. Now, did we eventually get to a point where the Windows-based system was just as secure and stable? Yes. But it required systems to be in testing for quite some time before switching over, and at least additional supervision for the first couple of months in production. This process is usually what caused other departments to just stick with what they had and enter into agreements to guarantee replacement parts.
 
Security through obscurity.
EDIT: At least with the floppy discs. Server 2003 is a no-no.
Don't forget security through mainstream obsolescence!
The only ones who will be able to fix that kind of tech will be retrocomputing enthusiasts. :cool:
 
Yes and no. I have also worked on government systems, and most of them were Unix-based running Oracle or IBM stuff. On many occasions, efforts were made to move I&C to Windows platforms, but the initial out-of-the-box experience and the difference in security and stability used to be quite evident. I don't know how it is now, because I don't do I&C anymore, but back then Windows was not in any way, shape, or form an equal to Unix, or Linux for that matter. Clustering services for Windows 2003 in particular used to be a joke; if you've never seen the Windows Clustering Service fail, just pray that you never do. One of the biggest reasons for cost overruns was stability and the testing required before production. Now, did we eventually get to a point where the Windows-based system was just as secure and stable? Yes. But it required systems to be in testing for quite some time before switching over, and at least additional supervision for the first couple of months in production. This process is usually what caused other departments to just stick with what they had and enter into agreements to guarantee replacement parts.

What you are saying here is true for any system. When switching from one platform to another when you already have programs developed, there will always be testing and modification that needs to be performed. I am not really sure what your point was.
 
Again, you are making statements that are just completely untrue. Closed source does not mean you cannot be sure of the security of your system; there are all sorts of tools that can be used to determine that. Also, I don't think you really understand how enterprises do their updates and system control. The Windows updates are not really a mess at all; I am not even sure where you are getting that from.

Using open source you cannot audit literally every bit of code.

B00nie is right in saying you could never "do an audit" of all of the code running in your system.
And you are right in saying not everyone would be able to audit code they don't understand.
So you are both onto something.
But if this were a super critical project, you could hire a kernel guy, and he would probably, in limited time, be able to review everything that runs on your metal.
Also, big established projects like Debian have their kernel people too, and while anyone can inject a backdoor, anyone can find it, too.
Linux was kind of at the forefront of virtualisation with 'chroot', or "jails": every instance of a service daemon had its own filesystem, and it could never access the global password file because it didn't even know of its existence, as it was beyond "/".
That doesn't make it unhackable, but it is some great damage control.
Also, open source has the advantage of publishing not only code but also higher-level design principles, and these are usually telltale of where to look for holes.
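To make the chroot point concrete without needing root: this toy resolver is only an illustration of the idea (the real isolation happens in the kernel at the system call level, not by path-string munging), but it shows why a jailed daemon can't even name the real password file:

```python
import os.path

def jail_resolve(jail_root, path):
    """Resolve `path` the way a chroot'd process sees it: absolute paths
    re-root at the jail, and '..' can never climb above the jail root."""
    # Normalize against a fake "/" first, which collapses any leading '..'.
    inside = os.path.normpath("/" + path.lstrip("/"))
    # Then re-anchor the result underneath the jail directory.
    return os.path.join(jail_root, inside.lstrip("/"))

# Inside the jail, "/etc/passwd" can only mean the jail's own copy...
assert jail_resolve("/srv/jail", "/etc/passwd") == "/srv/jail/etc/passwd"
# ...and even trying to climb out with '..' stays inside the jail.
assert jail_resolve("/srv/jail", "../../etc/passwd") == "/srv/jail/etc/passwd"
```

Again, just a model of the principle, not how chroot is implemented; the point is that nothing above the jail root is even addressable.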

B00nie, do you know that the DoD, and most of the Federal Government for that matter, don't take MS updates straight from Microsoft?

For example, the DoD uses its own customized OS builds; they were called Army Gold Master (AGM), and I think they are now called UGM (Unified Gold Master, or something like that). The Army looks over MS patches and decides which updates it will serve, which it will not serve, and which it will find other ways to mitigate. Now, I am not saying they have never pushed out a bad update, but it's not a mess like you claim. They just don't take Microsoft's products at their word and trust that they are working properly. They have teams that test those patches against the current deployment builds, and test the deployment of the updates as well.

It just sounds like you are unaware of how things really are with the DoD, and much of the Government's IT in general. That being said, there are parts of the government which are not up to these same standards.

Wow, I was applying for a position in our army one time and the guy had to jump through hoops during the interview because he was unauthorized to disclose this (quite innocent) level of detail :D
 
B00nie is right in saying you could never "do an audit" of all of the code running in your system.
And you are right in saying not everyone would be able to audit code they don't understand.
So you are both onto something.
But if this were a super critical project, you could hire a kernel guy, and he would probably, in limited time, be able to review everything that runs on your metal.
Also, big established projects like Debian have their kernel people too, and while anyone can inject a backdoor, anyone can find it, too.
Linux was kind of at the forefront of virtualisation with 'chroot', or "jails": every instance of a service daemon had its own filesystem, and it could never access the global password file because it didn't even know of its existence, as it was beyond "/".
That doesn't make it unhackable, but it is some great damage control.
Also, open source has the advantage of publishing not only code but also higher-level design principles, and these are usually telltale of where to look for holes.

Wow, I was applying for a position in our army one time and the guy had to jump through hoops during the interview because he was unauthorized to disclose this (quite innocent) level of detail :D

Some people take the "rules" too far. They act as if they are bound so tightly that they can't say things that are not actually protected information. At the same time, they aren't supposed to talk around things either. Sometimes I think trying to talk around something is worse than just coming out and saying it plainly and simply, without embellishment. Trying not to step in the puddle just means sooner or later you're going to get everything wet.

And chroot was originally a Unix command incorporated into Linux.

The chroot system call was introduced during development of Version 7 Unix in 1979, and added to BSD by Bill Joy on 18 March 1982
chroot - Wikipedia, the free encyclopedia

And I think it was a smart move on Joy's part, it's very powerful.
 
Honestly, it doesn't surprise me at all. I got out of the Navy 3 years ago, and we were still using Server 2003 for most of the ship and squadron servers and had only that year moved off XP. A great many software applications we used still required DOS and floppy drives.
 