Literally who ever said “backups are overrated”?
The people who laid off 85% of the IT dept.
My ex-company had kept all the data customers shared with us for more than 10 years. Structured and standardized, it should have been easy peasy.
Somehow they were “appending it wrong” in some way and the data was useless. I think they were trying to reduce the size by aggregating a bit, but they did it in a way that rendered the data unusable.
Of course the CEO wanted to train models with it anyway…
10 years and no one bothered to pull some information at random? I mean, companies generally have a schedule of assessments to verify their records. Even if it’s as simple as a checksum.
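Doesn’t have to be fancy, either. Here’s a minimal sketch of the kind of random spot-check I mean (Python, assuming a hypothetical manifest.csv of path,sha256 rows written when each file was archived, under a made-up /mnt/archive root):

```python
# Rough sketch only: assumes a hypothetical manifest.csv of "path,sha256" rows
# written at archive time, under a made-up /mnt/archive root.
import csv
import hashlib
import random
from pathlib import Path

ARCHIVE_ROOT = Path("/mnt/archive")        # hypothetical archive location
MANIFEST = ARCHIVE_ROOT / "manifest.csv"   # hypothetical checksum manifest
SAMPLE_SIZE = 20                           # records to pull at random each run

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

with MANIFEST.open(newline="") as f:
    rows = list(csv.DictReader(f))         # expects columns: path, sha256

for row in random.sample(rows, min(SAMPLE_SIZE, len(rows))):
    target = ARCHIVE_ROOT / row["path"]
    ok = target.exists() and sha256_of(target) == row["sha256"]
    print(("OK      " if ok else "MISMATCH"), target)
```

Run that on a cron schedule and you at least find out the archive rotted before the CEO asks to train models on it.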
The thing is, the data was expected to be slightly aggregated, not a 1:1 copy. The problem comes when you try to use it for analysis and realize it doesn’t make any sense.
I like train models
Everything is fine. Why would you need a backup?
Penny counters who don’t like paying for storage
Backups are always considered to be too expensive up until the point that not having backups becomes more expensive. This applies to redundancy of all kinds except the one that means firing employees for not setting up the other kinds.
Even at home. I do one USB backup and one internal backup of photos, home videos and documents. I would love to back up other stuff too, but hard drive prices have kinda stalled, and I can replace a lot of the other crap if need be.
Apparently someone in the South Korean government xD
I say that often. Especially after I forget to back something up and lose it.
I doubt anyone said it verbatim, but backups do get deemed a lower priority ad infinitum.
So I have lived in South Korea for 6 years now. The fact that this fire has had such a major impact is quite typical of Korean bureaucracy and tech administration. Very few backups, infrastructure held together with scotch tape and bubblegum, overworked devs and maintainers. It’s a bit sad, especially for a country that exports so many tech products.
If I had a nickel for every time someone didn’t backup their datacenter, I’d have two nickels.
Which isn’t a lot, but it’s weird that it happened at least twice.
Last time we lost disks at work, there were full backups.
They were just on the same disks as the data. And because everything is abstracted twice over, into virtual disks on virtual machines, containers and volumes, the people responsible for the backups didn’t even know it.
But wouldn’t you, like…check? That the backups are on their own drive? The whole 3-2-1 rule kinda makes you want to check this, no?
Or was it that they knew where the backup drives were, but didn’t know those drives were being virtualized away and were, like, in production use?
I dunno what possibilities they actually had. But knowing the place, I can fully believe both that they weren’t allowed to check and that they never bothered.
The most likely scenario in my head is that they sent a request to the provisioning team asking for the volume to be on a different disk, and that detail never made it to the technician actually doing the work (who sits in the next chair, but the requests have to come through the system).
(And the long term backups were fine. We lost 3 days of data.)
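For what it’s worth, even a dumb script catches the obvious failure (data and “backups” on the same physical disk), though it can’t see through every virtualization layer, which is kind of the point of this story. A rough Linux-only sketch, paths made up:

```python
# Rough Linux-only sanity check with made-up paths: do the data volume and the
# backup volume sit on the same underlying block device? LVM/VM layers can
# still hide the truth, which is exactly how the situation above happens.
import os
from pathlib import Path

def backing_devices(path: str) -> set[str]:
    """Best-effort walk from the device behind `path` to its parent disk(s)."""
    st = os.stat(path)
    major, minor = os.major(st.st_dev), os.minor(st.st_dev)
    node = Path(f"/sys/dev/block/{major}:{minor}").resolve()
    devices = {node.name}
    slaves = node / "slaves"               # device-mapper / LVM parents
    if slaves.is_dir():
        devices |= {p.name for p in slaves.iterdir()}
    if (node.parent / "dev").exists():     # partition -> whole disk (sda1 -> sda)
        devices.add(node.parent.name)
    return devices

data, backup = backing_devices("/var/lib/data"), backing_devices("/backups")
print("data on:", data, "| backups on:", backup)
print("SAME DEVICE!" if data & backup else "separate (at this layer, anyway)")
```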
Maybe they stored backups on-site
You met me at a very strange time in my life.
♫♫ Where Is My Mind? ♫♫ plays
In the background, explosions rip through buildings and they start to collapse.
dies after fall
Well, if that ain’t a whitewashed headline.
I collect stories like this for when I need to make a case for purchasing new gear or services.
You can’t lose computerised services if you don’t have computerised services. Checkmate SysAdmins.
In Lithuania, healthcare e-services went down after the basement where the servers were kept got flooded in a rainstorm. They were out for a couple of weeks.