Why Website Owners Need To Make Server Logs Their New Best Friend

The number one way to keep your site secure is to monitor it constantly.

If you check your site every day and look for what could be improved, you will most likely run into the sections of the site that are unsafe and prone to attack.

A lot of people create a web site and then do not look at the underlying code for months at a time, even if they wrote that code themselves.

As a webmaster you should always be on the lookout for any part of your site that might not be up to par, but the code on your site is not the only thing to watch – attacks can happen at the server level as well.

Therefore, you must be able to stop attacks from happening there while still keeping an eye on the rest of the site.

Server Logs, Your New Best Friend

There are several steps you can take to prevent this, but a good first step is to check your server logs.

The server logs are the files in which the server records its own activity.

As a webmaster you may be used to obsessing over checking the traffic and the activity of the people who come to your web site.

But checking the actual server logs is a little bit different.

You are checking the actual activity on your server – what functions are being called, who is logging in, what accounts are being used, things of that nature.

Checking the logs for this type of information makes it easy to see whether anyone is doing something on your server that they are not supposed to be doing.
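That kind of check can even be scripted. As a minimal sketch – the sample lines below are made up, in the style of the OpenSSH entries that many Linux servers write to /var/log/auth.log or /var/log/secure – here is one way to count failed login attempts per source address:

```python
import re
from collections import Counter

# Hypothetical sample lines mimicking OpenSSH's auth log; in real life
# you would read these from the log file your server actually writes.
LOG_LINES = [
    "Jan 10 03:12:01 web1 sshd[4211]: Failed password for root "
    "from 203.0.113.9 port 52144 ssh2",
    "Jan 10 03:12:07 web1 sshd[4211]: Failed password for invalid user admin "
    "from 203.0.113.9 port 52150 ssh2",
    "Jan 10 08:30:44 web1 sshd[5902]: Accepted publickey for deploy "
    "from 198.51.100.7 port 40022 ssh2",
]

# A failed-login line looks like:
#   Failed password for [invalid user] NAME from IP port N ssh2
FAILED = re.compile(r"Failed password for (?:invalid user )?\S+ from (\S+)")

def failed_logins(lines):
    """Count failed login attempts per source IP address."""
    counts = Counter()
    for line in lines:
        match = FAILED.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

for ip, attempts in failed_logins(LOG_LINES).items():
    print(ip, attempts)  # a burst of failures from one IP is worth a look
```

A sudden cluster of failures from a single address is exactly the kind of thing you would never notice from traffic statistics alone.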

For some people, checking the server logs may be a confusing experience.

Even if you are a developer, you might not be used to some of the lingo that the server uses.

This might be especially true if you are a Windows developer and you are using a Linux server to host your site.

In this case you could pay someone else to check your server logs, ask your web host to check them for you, or go to a web site that explains what the server is trying to tell you.

Once you study it for a little while, it will not be hard to figure out how to read the information in front of you.
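Much of that lingo is really just a fixed record layout. As an illustration – the entry below is made up, in the "combined" log format that Apache and Nginx commonly write for each request – here is a sketch of pulling one line apart into its named fields:

```python
import re

# A made-up entry in the "combined" access log format.
ENTRY = ('203.0.113.9 - - [10/Jan/2024:03:12:01 +0000] '
         '"GET /wp-login.php HTTP/1.1" 404 209 "-" "Mozilla/5.0"')

# client - user [timestamp] "method path protocol" status bytes ...
COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\S+)'
)

def parse_entry(line):
    """Split one combined-format line into named fields (or None)."""
    match = COMBINED.match(line)
    return match.groupdict() if match else None

fields = parse_entry(ENTRY)
# e.g. repeated 404s for /wp-login.php from one IP suggest a probe
```

Once each line is broken into fields like this, spotting odd requests becomes a matter of filtering rather than squinting.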

If you want a secure computing experience, then checking your server logs is a good way to get it.

If you own a web site then it is a good idea to try and learn some of the basics yourself.

You can also read the 13 things every webmaster needs to know about website security.

About Lee Munson

Lee's non-technical background allows him to write about internet security in a clear way that is understandable both to IT professionals and to people who just need simple answers to their security questions.


  1. If you are going to do searches, yes you need good security software, no doubt about that, and add a little common sense in there with it.

    • Common sense – amazing how many people don’t apply what is undoubtedly the most important element of internet security.

  2. After reading this article it made me think of something that I have always wondered about. When you do a search for a specific subject you seem to always come up with thousands of webpages on the subject. On reviewing a large number of those websites a user quickly notices a large number that are stagnant, no longer being kept up by the person or persons who put the site up. Sometimes, if they have a date on them, you may notice that pages have not been updated for years.
    Just how many of these websites are ones a user shouldn't visit?
    Even more interesting is why they are even still up; a lot of these sites are spin-offs of the original website where the info has been copied and reposted.
    It's too bad the internet isn't cleaned up every now and then and a bunch of unused sites taken down.

    • Hmmmm, it seems that what you have observed is counter to how search engine results pages should work.

      My understanding is that (generally) the first results that you see are the web pages that have been linked to the most, which would imply that they contain the most pertinent information with regard to your query.

      Also, where people copy pages something called a duplicate content filter comes into play where (in theory) only the original source of that content gets ranked by the search engine.

      If that is not what you are seeing then it just goes to show how much work the search engines still have to do in order to give their customers what they are looking for.

      As to your last point, didn’t you know that the internet is sometimes closed down for cleaning purposes???

      • Yes, I realize that the search engines should pull up the webpages linked to the most first, but when you see 300,000 pages on the same subject and go prowling through the last several thousand of those it's a mess, some with dates on them many years old.

        I didn't know they closed it down for cleanup; I'll have to read up on that.
        Seems a better cleaning is needed – free up some space before we run out, if that's possible, and I think it is in some ways.

        • Just how deep down into the results do you go Dave?

          (Most people only look at the first 3 results I believe)

          One observation of mine though is that even though many pages on a topic look different they are in fact not – I think a lot of webmasters simply rewrite each other's content, more's the pity.

          • I too think a lot of stuff is just rewritten.
            How deep? I have done searches on some things and gone 20 pages or more into the search results just to see what's there.
            Today I wonder if it's not better to go beyond the first page of any search results, but then I do look at the website address before I go clicking.

          • I actually think that the first pages of results are better now than they have been in the past.

            In days gone by people used to be able to rank well just by stuffing keywords into their pages and meta tags.

            Nowadays that doesn’t work and an increase in competition in just about every niche means that quality content seems to rise to the top more often.

            Of course that doesn't mean to say that I don't see some junk and there is, unfortunately, a risk of landing on a malware-infested page too, but that's what good security software is for, right?


