Atlanta Hit by Ransomware Attack

DooKey

Once again those pesky hackers have conned someone into clicking a link they shouldn't have. This time the City of Atlanta is the target, and some of its city data has been encrypted. So far it hasn't affected public safety, the airport, or the water department. It just goes to show that, no matter what, some people just can't be trusted to use computers.

The city is working with federal agencies, including the FBI and the Department of Homeland Security, as well as private sector partners, to determine the cause of the outage and to fix the problem, Bottoms said. When asked if the city would pay a ransom to resolve the problem, Bottoms said the city would seek guidance from federal authorities on the best course of action.
 
Personally I feel there has not been enough effort expended to investigate and find the fuckwits who perpetrate this sort of attack and turn them into pieces of novelty furniture.

It's true that education of the user does come into it, but then let's not hate on the poor old granny who gets conned on the phone into giving away obscene amounts of cash, for example.
 
Agreed, but these cases are hard to track down; these guys tend to cover their tracks well.
 
When are people going to learn? Systems with critical data need to be isolated and backed up.

If the IT manager cannot recover from backups, they should be fired.
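
And "backed up" has to mean "backed up and tested." Here's a rough Python sketch of the kind of daily sanity check I mean; the backup path, the .tar.gz format, and the 24-hour window are all made-up assumptions for illustration, not anyone's actual setup:

[CODE]
import sys
import tarfile
import time
from pathlib import Path

BACKUP_DIR = Path("/mnt/isolated-backups/city-data")  # hypothetical, illustration only
MAX_AGE_HOURS = 24  # assumed policy: at least one backup per day

def newest_backup(backup_dir):
    """Return the most recently modified .tar.gz in the backup directory, or None."""
    archives = sorted(backup_dir.glob("*.tar.gz"), key=lambda p: p.stat().st_mtime)
    return archives[-1] if archives else None

def main():
    latest = newest_backup(BACKUP_DIR)
    if latest is None:
        print("FAIL: no backup archives found")
        return 1

    age_hours = (time.time() - latest.stat().st_mtime) / 3600
    if age_hours > MAX_AGE_HOURS:
        print(f"FAIL: newest backup {latest.name} is {age_hours:.1f} hours old")
        return 1

    # Cheap restore test: walk every member of the archive. It won't catch
    # everything a full test restore would, but it does catch truncated or
    # corrupt archives before you actually need them.
    try:
        with tarfile.open(latest, "r:gz") as tar:
            members = tar.getmembers()
    except (tarfile.TarError, OSError) as exc:
        print(f"FAIL: cannot read {latest.name}: {exc}")
        return 1

    print(f"OK: {latest.name}, {len(members)} files, {age_hours:.1f} hours old")
    return 0

if __name__ == "__main__":
    sys.exit(main())
[/CODE]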
 
Someone is about to make some cash implementing a backup system of some nature.
 
More often than not it is NOT the fault of IT, but of the management/bean counters who cannot see the financial need for such an expenditure until it's too late.

I cannot tell you how often I have laid out very specific plans to back up and secure networks/systems/data, only to have management deny them solely as an "unjustified expense," and then be proven right... AFTER the fact. And yet I was held to blame for not taking action...

yeah how about a giant F U

EDIT: the F U is not meant for who I replied to, but for those in the above scenarios LOL
 
I hope you were able to lay out the steps you tried to take to CYA, lol! I have seen similar things happen in the past; they either don't trust what they're being told (I saw a company bring in a consultant who told them exactly the same thing their IT manager had told them), or they don't believe it will happen to them.
 
Our code repository was on a server that had been in use since 2005. IT KNEW the RAID was failing, and the last time they did a backup was six months earlier. Let me repeat that for emphasis: "IT knew it was failing and there were no backups." Well, guess what happened? If IT knows and doesn't make the criticality known and push for a simple $1,000 backup system, then that's on them. That little snafu cost the company at least $150K in lost productivity as the department tried to merge whatever source they had on their personal laptops, on top of the lost code history. We're talking terabytes of source + data sheets. Luckily everything was saved. We now have a backup in place, but jesus, what a major F-up. I would have someone's head if I was in charge.
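
For a code repo specifically, even a dumb nightly job that bundles the repo and verifies the bundle would have covered them. Rough Python sketch of that idea; the repo and NAS paths are made up, and it assumes the git CLI is on the box:

[CODE]
import datetime
import subprocess
from pathlib import Path

REPO = Path("/srv/code-repo")                 # hypothetical repository location
DEST = Path("/mnt/backup-nas/repo-bundles")   # hypothetical off-box destination

def backup_repo(repo, dest):
    """Bundle every ref in the repo into a single file and verify it."""
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    bundle = dest / f"{repo.name}-{stamp}.bundle"

    # 'git bundle create <file> --all' writes all refs plus their full history
    # into one self-contained file you can clone or fetch from later.
    subprocess.run(
        ["git", "-C", str(repo), "bundle", "create", str(bundle), "--all"],
        check=True,
    )
    # 'git bundle verify' confirms the bundle is valid and complete.
    subprocess.run(
        ["git", "-C", str(repo), "bundle", "verify", str(bundle)],
        check=True,
    )
    return bundle

if __name__ == "__main__":
    print(f"backup written and verified: {backup_repo(REPO, DEST)}")
[/CODE]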
 
Sadly, stories like this are the rule and not the exception, it seems. I've got plenty of similar stories from my 20+ years in IT.
 
Atlanta will likely be implementing Tanium and CrowdStrike soon. They will also need to wipe and restore their DCs, reset Kerberos, and reset all AD passwords. They should be on the lookout for Petya malware too. Likely the malicious actors were in the systems before they actually dropped the malware. Could potentially be different actors as well. They need to search for instances of PsExec and the compromised domain admin account the attackers got ahold of.

It is also unlikely this was spread via email. They likely got in via a compromised system and elevated from there. Once they could get to a DC, they were likely able to push out to other systems. Atlanta needs to completely shut down all network traffic and begin a system-wide port validation protocol. Infected systems need to be immediately isolated from the network. They also need to reach out to their AV vendor for the latest DAT files. If they can grab a box that has the virus but hasn't been able to reach out to its host to begin the encryption process, they can likely grab the malware files and decompile them. That should give them the specific IPs the malware reaches out to for the private key exchange.
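
To put the PsExec hunt in concrete terms, here's a rough Python sketch that flags service-install events (System log, Event ID 7045) mentioning the default PSEXESVC service name in a CSV export of the logs. The file name and column names are assumptions; adjust them to whatever your log tooling actually exports:

[CODE]
import csv
from pathlib import Path

# Hypothetical CSV export of the System event log, one row per event.
# Column names here ("EventID", "TimeCreated", "Computer", "Message") are
# assumptions; match them to whatever your export tool writes.
EXPORT = Path("system-log-export.csv")

def find_psexec_installs(export_path):
    """Flag service-install events (ID 7045) whose message mentions PSEXESVC."""
    hits = []
    with export_path.open(newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            if (row.get("EventID") or "").strip() != "7045":
                continue
            if "PSEXESVC" in (row.get("Message") or "").upper():
                hits.append((row.get("TimeCreated", "?"), row.get("Computer", "?")))
    return hits

if __name__ == "__main__":
    for when, host in find_psexec_installs(EXPORT):
        print(f"possible PsExec service install on {host} at {when}")
[/CODE]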
 
Tanium is an EPIC pain in the tail for software dev and processes which do a large amount of file access.
 
hahahahahaha!!! They are going to blame the Sysadmin who has been asking to buy a new server for 5 years to replace the 2003 machine that has been overloaded since it was put in.
 
Wish the US government would allocate the resources to tracking just one of theses ransomware distributing fucks down and forcefully abducting them and their loved one from whatever slavic shithole they are from and the publicly executing their children on live TV to make a point. Then put the people of world on notice... hey if you do this we will kill your loved ones.

Think you country will protect you... wrong.. think we won't find you... wrong think our response will be proportional and fair nope Were the USA and you fuck with us we burn you alive you worthless filth.
 
The US government has developed and distributed plenty of malware. I went to a community college where some idiot answered a phishing email and sent out the W-2 of everyone who worked there, which for some reason the college kept together in an unencrypted Word file. The dumbass emailed this file, and 1,300 employees never got their tax returns. The person who leaked the W-2s wasn't fired, and the whole staff had to go to "training."

Security is a joke at most places; for some, it's difficult to develop a plan and stick to it.
 
Some of my co-workers fall for the most basic of phishing emails. If my co-workers are representative of the average worker out there, then we have serious problems. It's not surprising that these hacking attempts are periodically successful.
 