source link: http://rachelbythebay.com/w/2013/03/16/greplogs/

Writing tools even when "Unix glue" would do

One of the many dumb little tools I made while working web hosting tech support was something called "greplogs". It was designed to offer quick answers to the questions customers had far too often. These were questions like "why did I have a bandwidth overage last month?" and "what's the busiest site on my server?" and even "how do you know your bandwidth monitoring is accurate?"...

Initially, all it did was grovel around in the usual boring log format emitted by Apache (and similar tools). It would add up either hits or bytes and would give you a sum at the end. It was a little like 'wc' in that sense, only it understood the "field" nature of the logs in question, and would handle byte counts appropriately.

In other words, if there was a hit for foo.jpg which is 500000 bytes and another hit for bar.jpg which is 250000 bytes, it would report 2 hits and 750000 bytes of transfer. It had just enough domain knowledge about those often-tricky ASCII logs to be useful.
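To make that concrete, here is a minimal Python sketch of the counting pass, assuming Apache's common log format, where the bytes-sent value is the last field on the line (a "-" when nothing was sent). It's an illustration of the idea, not the original tool:

    #!/usr/bin/env python3
    # Sketch only: count hits and sum bytes from common-log-format
    # lines on stdin. "-" in the size field means no body was sent.
    import sys

    def sum_log(stream):
        hits = 0
        total = 0
        for line in stream:
            fields = line.split()
            if not fields:
                continue
            hits += 1
            size = fields[-1]          # CLF puts bytes-sent last
            if size.isdigit():
                total += int(size)
        return hits, total

    if __name__ == "__main__":
        hits, total = sum_log(sys.stdin)
        print(hits, "hits,", total, "bytes")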

Of course, it didn't end there. Someone found they needed to whittle down the input data. Normally that would be a job for grep, but since all of this stuff was happening inside my program, I had to add an argument to let you specify expressions. Then you could say "-e foo" and it would only add up all of the hits matching "foo". This came in handy when only a subset of the traffic in a log was interesting.
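Bolting that expression filter onto the earlier sketch is one small step: compile the pattern once and drop non-matching lines before they get counted. The "-e" flag name is from the post; this particular shape is just one way to do it:

    import re

    def filtered(stream, expr):
        # yield only the lines matching the "-e" expression
        matcher = re.compile(expr)
        for line in stream:
            if matcher.search(line):
                yield line

    # e.g.: hits, total = sum_log(filtered(sys.stdin, "foo"))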

This made people pretty happy, but later on I got a request to make it handle the "xferlog" format from whatever FTP daemon was common on those boxes at the time. The same basic ideas prevailed, and sooner or later I had something which could also digest that format and emit similar results, along with the same expression-based filtering as before.
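A sketch of the xferlog side would follow the same pattern. Assuming the classic wu-ftpd layout (five date tokens, then transfer time, remote host, and file size), the size lives in the eighth whitespace-separated field; that position is an assumption about the old format, so treat it accordingly:

    import sys

    def sum_xferlog(stream):
        transfers = 0
        total = 0
        for line in stream:
            fields = line.split()
            if len(fields) < 9:
                continue            # not a complete xferlog entry
            transfers += 1
            if fields[7].isdigit(): # file-size field in wu-ftpd xferlog
                total += int(fields[7])
        return transfers, total

    if __name__ == "__main__":
        n, total = sum_xferlog(sys.stdin)
        print(n, "transfers,", total, "bytes")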

I don't think I ever got to the point of having it chew on mail logs, like those emitted by sendmail or (gasp) qmail. Having a message count and a cumulative byte count probably would have been useful in a few situations, but it was well outside the 80-90% of cases which I had knocked out with just web server and FTP logs.

One idea I had but never acted on was writing a helper tool which would grovel through the Apache config files, following things like "Include" directives, to identify all possible log files. Then you could apply a filter to that list to get a subset. That list of logs could then be fed to the first tool in order to get counts from them.
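Since that helper never got written, any code for it is pure speculation, but the traversal might have looked something like this sketch, with the directive names taken from a typical Apache config of that era:

    import glob
    import os
    import sys

    def find_logs(conf_path, seen=None):
        # Walk one Apache config file, chasing Include directives and
        # collecting CustomLog/TransferLog targets. Simplified: relative
        # paths aren't resolved against ServerRoot, and piped loggers
        # and directory Includes aren't handled.
        seen = seen if seen is not None else set()
        if conf_path in seen:
            return []               # guard against Include loops
        seen.add(conf_path)
        logs = []
        with open(conf_path) as conf:
            for line in conf:
                tokens = line.split()
                if not tokens or tokens[0].startswith("#"):
                    continue
                name = tokens[0].lower()
                if name in ("customlog", "transferlog") and len(tokens) > 1:
                    logs.append(tokens[1])
                elif name == "include" and len(tokens) > 1:
                    for inc in glob.glob(tokens[1]):
                        if os.path.isfile(inc):
                            logs.extend(find_logs(inc, seen))
        return logs

    if __name__ == "__main__":
        for path in find_logs(sys.argv[1]):   # e.g. /etc/httpd/conf/httpd.conf
            print(path)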

In this sense, you could have a customer on the phone asking about hit counts for www.foo.com, and instead of trying to figure out where he hid it, you could just "greplogs --hits `findlogs www.foo.com`" and let it run.

Obviously, yes, you could do all of this with various combinations of tools like find (or even locate), grep, and awk. Now, try explaining how to build a multi-program pipeline and the awk language to someone who still right-clicks and selects "copy" in their terminal while on a Linux box running X (hint: select and middle-button!). You could do that, or you could just point them at this little binary and let them run queries, confident that it's locating and parsing things correctly.

When there's a never-ending stream of tickets, phone calls, and customers who want results now, with little time to actually train anyone, the all-in-one tool tends to win over the Unix glue approach. I'm not really proud of this, but that's the way things worked out.

