
Spooktacularly Scary Database Stories


The nights are lengthening and the spookiest day of the year is nearly upon us, Halloween! In the spirit of the holiday, we asked our team to share their scariest tales of database dread, blood-curdling BIOS failures, and dastardly data destruction, and some of the responses are downright chilling.

Grab some candy and check out the stories that Perconians are too afraid to talk about after dark!

Brian Walters, Director of Solution Engineering

Rookie DBA at a company with a shoestring budget = dev, test, and prod on the same server. What could possibly go wrong?
So, I’m about two years into my career as a DBA, and part of my job is maintaining the database for the company’s MRP system. This system is critical: without it, the whole plant shuts down.
During the implementation of the MRP system, the company had fallen into the bad practice of using the production database server for development and testing purposes as well. I’d made several requests for dedicated dev/test hardware, but this just didn’t seem to be a priority for those controlling the budget.
My days usually started with the same few tasks: checking the backups from the night before, deleting the prior day’s testing database, and creating a new testing environment by restoring last night’s backup. I had my routine pretty tight; most of it was scripted. All I had to do was change an environment variable and run the daily scripts.
This all worked fine until one day… that morning, I was just a little more tired than normal. I logged into the database server and thought I had set the environment variables to point to the testing database. By some mistaken force of habit, mental lapse, or temporary spooky hallucination, I had actually set them to prod… and then I ran the delete-database scripts.
Somehow, I realized my mistake almost before the Enter key was fully depressed. But by that time it was already too late. It took less than ten seconds for the first phone call to arrive. Naturally, I sent it to voicemail, along with the next three calls that immediately followed. My next instinct was to press the Send All Calls button. Head in my hands, fully realizing what had just happened, I struggled to understand how I had done it.
After a quick conversation with my boss, who also happened to be the CFO, my schedule was cleared for the day. The remainder of the morning was spent practicing my Point in Time Recovery skills. To this day, I am grateful that I had a solid and tested backup and restore plan. I went home early that day. And that was the last time I made a mistake like that. We purchased dedicated dev/test hardware about a month later.
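A solid, tested restore plan is what turns a morning like this into a recoverable one. As a rough sketch only: the story never names the database engine, so the example below uses SQL Server syntax purely for illustration, and the database name, file paths, and timestamp are all hypothetical.

```sql
-- Point-in-time recovery sketch (illustrative only: the engine, names, paths,
-- and timestamp are assumptions, not details from the story above).

-- 1. Restore the most recent full backup, leaving the database ready to accept log restores.
RESTORE DATABASE mrp
    FROM DISK = N'D:\backups\mrp_full.bak'
    WITH NORECOVERY, REPLACE;

-- 2. Replay the transaction log, stopping just before the accidental delete was run.
RESTORE LOG mrp
    FROM DISK = N'D:\backups\mrp_log.trn'
    WITH STOPAT = N'2019-10-31T08:55:00', RECOVERY;
```

The point is less the exact syntax than the precondition: a restore like this only works if the backups exist and have been tested, which is exactly what saved the day here.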

Robert Bernier, PostgreSQL Consultant

I was working as a newly hired Senior DBA when two developers suddenly appeared at my desk in a very agitated and excited state of mind. They were upset because they had accidentally pushed OS updates to the embedded devices installed at many of our clients. These premature updates effectively bricked the devices, which numbered in the tens of thousands.

I suppose I shouldn’t have been surprised, but they actually “demanded” that I execute a rollback at once. Needless to say, being a new hire, there were a lot of things I still didn’t know, and I was anxious to avoid making the situation worse. So I invited them to find a couple of chairs, sit next to me and, taking their time, “explain” who they were and what they did at the company. Eventually, I got around to the events leading up to the incident. Slowing them down was my primary concern, as the rollback’s success hinged on their state of mind. In time, they were able to demonstrate the issue and its resolution. Within a couple of hours, we staged the rollback across the affected devices and unbricked them.

In retrospect, it could have been worse: the data store I was managing held several hundred terabytes representing 600,000 embedded devices.

The moral of the story, heh heh… always know where to find your towel.

Audrey Swagerty, Customer Success Manager

This is not a tech story but a real nightmare :wink:

When I started as a CSR, 3.5 years ago, I was not familiar with our industry and the technologies, so it was quite the challenge… For some time (I guess until I got more comfortable with my new job), I used to have a recurring nightmare. I was being chased through the woods by a girl… and she would finally catch up with me and grab me, and I would ask her name (don’t ask me why… instead of asking her not to kill me, right?)… And she would say: I am MongoDB!!! Then I would wake up!

I have not thought about that story in a long time (and have not had the same nightmare since), so it was fun to share it again with you! Hopefully, it won’t happen again this weekend… I have been trying to help a customer with Kubernetes questions, so you never know! :joy:

Marcos Albe, Principal Technical Services Engineer

The worst horror story is a hospital introducing inconsistencies into a database… I always feared someone with asthma would end up as an amputee due to broken data!

Patrick Birch, Senior Technical Writer

While I was a SQL Server DBA, my CTO allowed the report writers to access the production database. The report writers were very nice people but were lacking in certain qualities, such as the ability to write SQL code. I wrote their queries and asked them to run their requirements through me. Well, I went to lunch one day, and when I returned the CTO and other managers were running around. One of the report writers had written a cartesian join (cross join), and it was dragging the production database down to a crawl!
I killed the connection and had a long talk with the report writers. The managers approved my building a data warehouse the next day.
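For readers who have never watched one take down a server, here is a rough sketch of what an accidental cartesian join looks like next to the query that was probably intended. The table names, column names, and row counts are made up for illustration; they are not from the actual incident.

```sql
-- Accidental cartesian join: with no join predicate, every row of orders is
-- paired with every row of customers, so 1 million orders x 100,000 customers
-- explodes into 100 billion result rows. (All names here are hypothetical.)
SELECT o.order_id, c.customer_name
FROM orders AS o, customers AS c;

-- The intended query: an explicit join condition keeps the result bounded.
SELECT o.order_id, c.customer_name
FROM orders AS o
JOIN customers AS c
    ON c.customer_id = o.customer_id;
```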
 

Martin James, Vice President of Sales EMEA & APAC

At my last company, I started to look at the scary reality of data security in healthcare. As mists and mellow fruitfulness provided the perfect backdrop to the spooky traditions of Halloween, ghostly goings-on were being uncovered in unexpected places. Not in gloomy churchyards or crumbling manor houses, but in the databases of general practitioners in England.

In 2016, the data of 57 million patients was held within general practitioners’ records. However, census data at the time suggested this figure should have stood at only 54 million. So who were these extra 3 million people? These records belong to ‘ghost patients’: patients who have died or emigrated, or whose records are duplications or inaccuracies in record keeping. Either way, they have an impact on surgeries, on funding, and on the services provided, adding unnecessary pressure to the NHS and leaving it unable to form a precise picture of its patients and the care it provides.

So, NHS England began a ghost hunt to track down and identify the owners of these records so it can update its data and save money. If a patient hasn’t seen their GP for five years, they’ll be sent a letter asking them to respond. If they don’t respond, they’ll be sent a second letter, after which they’ll be removed from the patient register. This measure could make an instant saving: according to the BBC, family doctors are paid for every patient registered on their list (ghost or not), and the Times puts the cost at around £400 million a year.

A deeper dive into patient data, and the connections within it, could be of great benefit to the health service. It would enable a tighter hold on data, drive compliance, and help the NHS improve the precision and accuracy of its records. Its data would become an asset and a means of national health intelligence. A utopian view? Perhaps – but one without the need for ghostbusters!

What’s your scariest database story? Let us know in the comments or reply to us on social so we can be on the lookout for database ghouls and goblins! And to learn how Percona’s experts can take the scary out of database management, check out our open source database support, managed services, and consulting services.

