Analyzing traffic statistics
Why is everyone worried about the amount of traffic?
Traffic is at the heart of the Internet. Traffic is what keeps the Internet alive and kicking. An Internet without all these people (you and me included) searching for information, products and services would be like an ocean without water. Traffic is the Internet.
This wonderful, miraculous thing that has changed so many lives by making an abundance of information fast and easy to find does, however, require money to exist.
The web becomes increasingly commercial as the years pass. The educational purposes and idealistic concepts that were once so closely tied to the Internet have lost most of their ground in the fight with the mighty dollar. The ‘net is no longer a place where the geeks of the day share their knowledge in a Utopian fraternity.
Today’s Internet is a savage place where the bigger fish eat the smaller ones, where monopolies are starting to emerge, a place governed by laws that are written along the way, by which time some people have already been hurt.
It’s a lot like the Wild West in several respects, except that the gold rush is now a traffic rush. Just as gold meant money and was the talk of the day back then, web traffic today means money and is on the lips of every website owner on the planet.
What’s all this got to do with web hosting?
Well, as a matter of fact, for most people this has everything to do with hosting. Why? Because the server that hosts your website knows exactly how many pages it served, when it served them, and many details about the person who requested them.
The web server can, and will if instructed to, save this kind of data in what is commonly known as a raw log file. The data in these log files is written in a standard format, such as the Common Log Format (and its “combined” variant) used by Apache.
A raw log file is basically a text file that can be viewed with an application such as Notepad. I reckon anyone can at least partially understand the information stored in such a file just by looking at it.
Raw logs are a standard feature of hosting packages nowadays, but you should make sure nevertheless. They are important if you plan to use advanced client-side software to analyze the traffic your website receives.
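To give you an idea of what sits inside such a file, here is a minimal sketch that parses one line in the Apache “combined” log format. The log line itself is a made-up example, and the regular expression is one common way of splitting the fields, not the only one:

```python
import re

# A hypothetical raw log line in the Apache "combined" format.
LINE = ('203.0.113.7 - - [12/Mar/2005:18:05:42 +0000] '
        '"GET /articles/hosting.html HTTP/1.1" 200 10439 '
        '"http://www.google.com/search?q=web+hosting" "Mozilla/4.0"')

# Fields: IP, identity, user, timestamp, request line, status code,
# bytes sent, referrer, user agent.
PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

match = PATTERN.match(LINE)
fields = match.groupdict()
print(fields["ip"])        # who asked for the page
print(fields["request"])   # which page was asked for
print(fields["referrer"])  # where the visitor came from
```

As you can see, every request carries the visitor’s address, the page served, the status code and, most interestingly, the referring page, which is what the analyzers discussed below build their reports from.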
Analog, Webalizer and Awstats
For smaller sites, web-based software such as Analog, Webalizer or Awstats is usually enough. These programs run on the server and are most of the time included in the hosting package. If you don’t plan to use your own software to analyze the logs, look for a host that offers these in the package you’re planning to buy. Note that they are common features; almost all hosts include them in the package because they’re free, so not offering at least one of them is rather strange.
Analog is the most basic of the three, but it’s quite popular — perhaps because it’s been around for a while. Webalizer offers similar information but in a different format and somewhat more detailed. Awstats is quite nice and it’s the better looking of the three.
All three analyzers can help you find out new things about your visitors. You’ll not go "WOW!" when using them, but you’ll know a thing or two about your visitors nevertheless. Just so you can get an idea of what they look like, here are their demos: Analog, Webalizer and Awstats.
Because we tend to be so concerned with numbers, the first thing we want to know is the number of visitors the site has received. More important, however, is what we learn about those visitors.
Which pages do they prefer? Where did they come from? From the search engines? Which search engine? What were they searching for in that search engine? From a link exchange page? On what website? From a directory? From a message board?
The answers to those questions provide information that can and should be acted upon. Knowing the sources of your traffic tells you a bit about how targeted that traffic is, and it gives you hints on what to do to increase the traffic in the future. Suppose you notice that much of your traffic comes from directories: that is a hint that directory submissions are an effective technique for bringing in even more traffic.
One great thing is that traffic from theme-related directories is fairly well targeted. We’ll discuss (or rather I’ll discuss, by myself) the quality of traffic a bit later.
You’ll also notice that not all pages bring in the same amount of traffic from the search engines, and that not all search engines send the same amount of traffic. As I write this, Google is surely the most generous search engine. Although Yahoo and MSN have comparable audiences, they fail to send much free traffic to websites. Why? Money is the answer, but let’s not get into that! The subject here is traffic statistics and analysis, not search engines and their behaviors.
As I was saying… some pages receive more traffic than others. Once you know which pages get the traffic, you know on which pages you should concentrate your efforts to get more sales/revenue/newsletter sign-ups or whatever you want.
I had the pleasure of using Urchin, a commercial web-based log analyzer that was included in a hosting package I once had. My initial impression was that its interface is a lot nicer than those of its free siblings Analog and Webalizer, and that it’s also a bit more powerful.
After using it some more, I found it to be a great tool, and I would certainly recommend finding a host that has it installed, especially if you’re not going to pay for raw log analysis software. Sure, Urchin isn’t crucial, but it’s a very nice feature nonetheless.
When I had it, it helped me understand more about my visitors and the way they travel inside the site, which in turn helped me make decisions about the site’s layout. If it proved useful for my site, which is mainly informative in nature, it must be ten times as useful for a purely commercial website.
Raw logs analysis
Raw logs are very useful indeed. Once you download them, you can use all kinds of software to analyze them. Some programs are powerful, others less so; some have nice graphics and charts, others have ugly ones.
A nice program I’ve used is WebLog Expert. I like that it’s easy to understand and has nice graphics and stats, and I’ve noticed that it’s developed at a fast pace. It even has a free Lite version that is quite good. Sure, there are lots of other programs out there and this is just one I happened to like; there are plenty to choose from at Dmoz.
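If you’d rather poke at the raw logs yourself, the core of what these programs do is simple counting. Here is a minimal sketch of a "top pages" report over a few made-up log lines:

```python
import re
from collections import Counter

# A few hypothetical raw log lines (common log format).
log_lines = [
    '1.2.3.4 - - [12/Mar/2005:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '5.6.7.8 - - [12/Mar/2005:10:02:00 +0000] "GET /articles.html HTTP/1.1" 200 1024',
    '1.2.3.4 - - [12/Mar/2005:10:03:00 +0000] "GET /index.html HTTP/1.1" 200 512',
]

# Pull the requested path out of each GET request line.
request_re = re.compile(r'"GET (\S+) HTTP')

hits = Counter()
for line in log_lines:
    m = request_re.search(line)
    if m:
        hits[m.group(1)] += 1

# Most requested pages first, like the "top pages" report in Webalizer.
for page, count in hits.most_common():
    print(page, count)
```

A dedicated analyzer adds filtering, charts and session detection on top of this, but the raw material is the same.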
Downtime and statistics
Detailed statistics can be useful because, if your website is popular enough to ensure a relatively constant flow of visitors, you can detect downtime from a lack of traffic in the logs. Say your website receives an average of about 200 visitors per hour, and basically never fewer than 75. If you then spot a three-hour gap in the stats, the site was most likely down for that period.
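The idea above can be sketched in a few lines: bucket the log timestamps per hour, then flag every hour that falls below your usual minimum. The timestamps and the threshold here are illustrative assumptions:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical hit timestamps pulled from a raw log file.
timestamps = [
    "12/Mar/2005:09:15:00", "12/Mar/2005:09:40:00",
    "12/Mar/2005:10:05:00",
    # nothing logged between 11:00 and 13:59 -- a suspicious gap
    "12/Mar/2005:14:10:00", "12/Mar/2005:14:30:00",
]

MIN_HITS_PER_HOUR = 1  # a busy site might use a higher floor, e.g. 75

# Bucket the hits per hour.
hits_per_hour = Counter()
for ts in timestamps:
    t = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S")
    hits_per_hour[t.replace(minute=0, second=0)] += 1

# Walk every hour between the first and the last hit; hours below the
# threshold are candidates for downtime.
hours = sorted(hits_per_hour)
gaps = []
hour = hours[0]
while hour <= hours[-1]:
    if hits_per_hour.get(hour, 0) < MIN_HITS_PER_HOUR:
        gaps.append(hour)
    hour += timedelta(hours=1)

print(len(gaps), "hour(s) with suspiciously little traffic")
```

On this sample it flags the three empty hours between 11:00 and 13:59. Of course, a quiet hour is only circumstantial evidence; a proper uptime monitor is the authoritative check.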
Is it all about the numbers?
As I said at the beginning of this article, everybody is concerned with the amount of traffic. While higher visitor numbers usually translate into more sales or other desired actions on the part of the visitors, it is often the quality of the traffic that determines profitability.
The return on investment (ROI) is a much-touted profitability indicator, and it can be seriously improved if the traffic you bring to your website is highly targeted. Suppose your website sells diamonds. If you bring traffic from a programming forum, that will result in very few sales, if any. If you bring in people who searched for "cheap diamonds" in a search engine, you stand a considerably higher chance of landing a sale, because that traffic is targeted.
Also, it’s not always about bringing in more traffic, but about getting more out of the traffic you already have. It’s a known fact that sales can be increased impressively simply by changing the wording of a presentation. One could argue that it’s a lot easier and cheaper to increase sales by 50% by altering your sales copy than by getting 50% more traffic. It is my belief that the best solution is to concentrate efforts on both.
For a reasonably successful website, making decisions based on simple raw logs or on Analog or Webalizer reports alone is not very smart. There are advanced solutions that can track the path of your visitors with pinpoint accuracy, identify the visits that resulted in sales, and so on. Implementing such advanced tracking is a good decision once your website can afford it. Some of the competitors in this field: Clicktracks, WebSideStory, Webtrends and Urchin.
Logs can be very useful to any website owner. Often this small amount of information is enough to prompt significant and very beneficial changes to a website. A good hosting package provides the tools to analyze your traffic, and any webmaster should use them to ensure that the website performs at its best.