Sawing Linux Logs with Simple Tools

    Date: 24 Sep 2004
    Posted by: Anthony Pell
    So there you are with all of your Linux servers humming along happily. You have tested, tweaked, and configured until they are performing at their peak of perfection. Users are hardly whining at all. Life is good. You may relax and indulge in some nice rounds of TuxKart. After all, you've earned it.

    Except for one little remaining chore: monitoring your log files. [insert horrible alarming music of your choice here.] You're conscientious, so you know you can't just ignore the logs until there's a problem, especially for public services like Web and mail. Somewhere up in the pointy-haired suites, they may even be plotting to require you to track and analyze all sorts of server statistics.

    Not to worry, for there are many ways to implement data reduction, which is what log parsing is all about. You want to slice and dice your logs to present only the data you're interested in viewing, unless you wish to devote your entire life to manually analyzing log files. Even if you only pay attention to log files when you're debugging a problem, having some tools to weed out the noise is helpful.

    The simplest method is a keyword search. Suppose you want to separate out the 404 errors in your Apache log, and see if you have any missing files:

    $ grep 404 bratgrrl.com-Aug-2004
    ...
    212.27.41.34 - - [30/Aug/2004:02:25:13 -0700] "GET /robots.txt HTTP/1.0" 404 - "-" "Pompos/1.3 http://dir.com/pompos.html"
    65.54.188.90 - - [30/Aug/2004:10:32:26 -0700] "GET /robots.txt HTTP/1.0" 404 - "-" "msnbot/0.11 ( +http://search.msn.com/msnbot.htm)"
    207.65.113.58 - - [12/Aug/2004:06:49:11 -0700] "GET /favicon.ico HTTP/1.1" 404 - "-" "Opera/7.21 (X11; Linux i686; U) [en]"
    ...

    These entries are typical. This site has no robots.txt or favicon, so any requests for these files generate a 404 error. The first two are Web bots; the third entry is probably some random surfer. You can ignore these, so let's screen them out.
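
    One simple way to do that screening is with grep's inverted matching, and awk can then tally whatever is left. The commands below are only a sketch against the same hypothetical log file used above, and they assume the standard combined log format, where the requested path is the seventh whitespace-separated field:

    $ # Show only the 404s that are not robots.txt or favicon.ico requests
    $ grep 404 bratgrrl.com-Aug-2004 | grep -v -e 'robots\.txt' -e 'favicon\.ico'

    $ # Tally the remaining missing files by requested path
    $ grep 404 bratgrrl.com-Aug-2004 | grep -v -e 'robots\.txt' -e 'favicon\.ico' | awk '{print $7}' | sort | uniq -c | sort -rn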
