

Apr 02, 2013

Do You Know Where You Stand?

Posted by Bill Anderton

One of the standard questions that I always ask when I first meet with a church about their website is, “How many visitors do you have to your website?”

Many churches don’t know. They do not look at their web servers’ log files regularly, if at all! Of those that do, often they don’t know what to look for or how to interpret the data they see.

An old adage says, "You can't manage what you can't measure." While this quote is incorrectly attributed to William Edwards Deming, the great statistician and consultant, it still contains a lot of truth. If you don’t know a great deal about how your website is performing and the people who visit it, you will never know how effective your site is AND you will not know how to make changes to become more effective.

Deming did advocate a "Plan-Do-Check-Act" cycle (PDCA) that is still great advice to people planning online ministries. The "check" step in Deming's cycle was the study of the actual results of your "do" phase. The "check" phase provides an informed basis for corrective/improvement actions to be taken in the "act" phase.

Deming’s “do” phase included the collection of data. He is quoted as saying, “In God we trust; all others must bring data." By putting the collection of data into the “do” phase, he placed an importance on building the foundation for the later “check” phase where the important analysis of the collected data takes place.

Collecting and analyzing data is very important, even critical, in the management of websites and online ministries. There will always be differences, even significant differences, between “as planned” and “as realized.” Without a “check” phase, we won’t know how to make changes. Worse, uninformed changes could easily do more harm than good!

Of course, this assumes that you are actively managing your website and continuously trying to improve it, which I believe is absolutely essential for ALL websites. Note that Deming's PDCA cycle is circular and never ends; the "act" phase feeds directly back into the "plan" phase to restart the cycle.

Fortunately, all web servers produce detailed logs about the visitors who visit each website. Also, there are many very good log-analysis software packages (some free) that will turn the raw log data into reports that are more easily understood. Further, Google makes an excellent tool available (both as a free version and a premium version) called Google Analytics (http://www.google.com/analytics/index.html). Taken together, all of these tools produce a wealth of information, freely available to webmasters who wish to make the effort to monitor their website and use the information as a basis for improving their websites and ministries.

The raw server logs are used to feed the log-analysis software. Most web hosting companies will provide the reports produced by at least one analysis package. The reports are generated on a regular schedule (usually daily) and saved to the administrative control panel of the website, where they can be viewed after logging in. Some hosting companies produce real-time intraday reports so you can monitor your visitors throughout the day.

Once you get your hands on these reports, the next requirement is to understand what the reports are telling you.

While all of the log-analysis software packages are subtly different, most report some variations of the same things:

  • Visits - Number of visits made by all visitors. Visits are actually "sessions" where a unique IP accesses a page, and then possibly requests more pages within a specified time period. All of the pages are included in the visitor’s session; therefore you should expect multiple pages per visit and multiple visits per unique visitor.
  • Pages - The number of "pages" viewed by visitors. Pages are usually HTML, PHP or ASP files, the “page container” itself. The pages metric does NOT include images or other files requested as a result of loading a page.
  • Hits - Any files requested from the server. This includes pages (html, php or asp files) plus all other files such as images, css files, document files, etc.; basically all of the elements that are embedded in an html page and served by the web server.

A classic mistake of newbie webmasters is to put too much weight (importance) on the “hits” metric. It is always the far bigger number because modern web pages have dozens (even approaching hundreds) of elements that the web server serves upon each page request. This inflated number is so big that newbies want to believe it! Also, popular culture, in movies and elsewhere, always talks about “hits” and misinterprets “hits” as “people coming to the site.” While an important metric for server engineers, “hits” do not measure people and therefore have little value or meaning to the webmaster.
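To make the difference concrete, below is a minimal sketch, in Python, of how a raw server log could be tallied into “hits” and “pages.” It assumes an Apache/nginx-style “combined” access log; the file name access.log and the list of page extensions are my own illustrative choices, not anything your hosting company is guaranteed to use.

    import re

    PAGE_EXTENSIONS = {".html", ".htm", ".php", ".asp", ""}  # "" covers "/" and other extension-less paths

    # Each combined-log line begins: client-IP - - [timestamp] "METHOD /path HTTP/x.x" ...
    LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

    hits = 0
    pages = 0
    with open("access.log") as log:                # hypothetical log file name
        for line in log:
            match = LOG_PATTERN.match(line)
            if not match:
                continue
            hits += 1                              # every file served counts as a hit
            path = match.group(2).split("?")[0]    # drop any query string
            dot = path.rfind(".")
            ext = path[dot:].lower() if dot > path.rfind("/") else ""
            if ext in PAGE_EXTENSIONS:
                pages += 1                         # only the "page container" counts as a page

    print(f"hits: {hits}, pages: {pages}")

On even a modest site, the hits figure this produces will dwarf the pages figure, which is exactly why “hits” flatters a newbie webmaster.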

It is important to first gain a good understanding of the number of human visitors coming to the website. Importantly, we want to understand the number of “unique visitors.” Unfortunately, many of the standard reports only report “visits” and NOT “unique visitors.” The “visits” metric typically measures sessions. The same person coming back to the site several minutes later might be counted as a new session and therefore a new “visit.” In other words, “visits,” while an important metric, can be inflated and give a false sense of the number of people visiting your site.

Some reporting software specifically reports “unique visitors,” and this is the most important metric.

My own definition of a “unique visitor” is as follows:

A "Unique Visitor" is defined as someone who visits in a single session from a unique IP address where the session is a series of clicks on hyperlinks in our site (one or more "pages" and many "hits") by this individual visitor during a specific period of time. A Unique Visitor session is initiated when the visitor arrives at our site and it ends when the browser is closed or there is a specified period of inactivity (typically an hour or more.)

Yes, by this definition, “unique visitors” will be the smallest of all readily available numbers but it is the most telling.
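To illustrate the definition above, here is a minimal sketch, again in Python and again assuming a combined-format access.log, that groups requests into sessions by IP address with a one-hour inactivity timeout and counts both “visits” (sessions) and “unique visitors.” Grouping by IP address alone is a simplification; commercial analytics packages also consider cookies, user agents, and more.

    import re
    from datetime import datetime, timedelta

    LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')
    SESSION_TIMEOUT = timedelta(hours=1)   # "a specified period of inactivity"

    last_seen = {}                         # IP address -> time of that visitor's latest request
    visits = 0                             # sessions, the number most reports call "visits"
    unique_visitors = set()

    with open("access.log") as log:        # hypothetical log file name
        for line in log:
            match = LOG_PATTERN.match(line)
            if not match:
                continue
            ip = match.group(1)
            # Combined-log timestamps look like 02/Apr/2013:09:15:42 -0500
            when = datetime.strptime(match.group(2), "%d/%b/%Y:%H:%M:%S %z")
            if ip not in last_seen or when - last_seen[ip] > SESSION_TIMEOUT:
                visits += 1                # a new session for this IP
            last_seen[ip] = when
            unique_visitors.add(ip)

    print(f"visits (sessions): {visits}")
    print(f"unique visitors:   {len(unique_visitors)}")

Run against the same log, the unique-visitor count will come out smaller than both visits and pages, just as described above.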

However, we’re not fully done yet. The next important question to answer is, “Of all of our unique visitors, how many were human?”

No, we’re not discounting space alien visitors, but we do want to quantify the number of robot (non-human) visitors. Robots are “web crawlers” that come into a website to gather information. Most web crawlers come into the site for legitimate, even valuable, reasons, such as the search engines’ crawlers that gather the information needed to get our pages indexed. These types of robots are our friends.

However, we don’t want to count robots with the same weight (importance) as human visitors.

It is not unreasonable for a small church website to have 30-100 robot visitors per day. You might find that your website is yielding 60 unique visitors per day, but deeper analysis might show that 57 of them are robots crawling your site. Yes, that’s right; a site could be yielding only three unique assumed-human visitors per day! By the way, this is NOT a hypothetical example but the actual results from my own church when I started improving its then six-year-old website. We knew the site wasn’t very effective, but it was shocking to see how badly it was performing.

Therefore, we have to do a simple subtraction: start with “gross unique visitors” and subtract the number of unique visitors that are robots.

Some hosting providers will report the number of robot unique visitors, and this makes it easy to get to “net assumed-human unique visitors,” the number that is most important in analyzing the effectiveness of your website.

If your web host doesn’t report the number of robots, I use a simple trick that will provide some insight into the number of robots coming into a site. I always use a “robots.txt” file in all of my websites. By convention, all well-behaved robots should check this file upon arriving at a website. See http://en.wikipedia.org/wiki/Robots_exclusion_standard for more information. With a robots.txt file in place, I can check the number of unique visitors whose first request is for this file and assume them to be robots (no human has a reason to visit this file, much less land on it at the beginning of a session). I can then take the gross unique visitors metric and subtract out the unique visitors who landed on the robots.txt file to arrive at “net assumed-human unique visitors.”
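Here is a minimal sketch, in Python, of that robots.txt trick, under the same combined-access-log assumption as the earlier sketches: any IP address whose first request in the log is for /robots.txt is counted as a robot, and every other unique IP is treated as an assumed-human visitor.

    import re

    LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

    first_request = {}                     # IP address -> path of that IP's first request
    with open("access.log") as log:        # hypothetical log file name
        for line in log:
            match = LOG_PATTERN.match(line)
            if match and match.group(1) not in first_request:
                first_request[match.group(1)] = match.group(2).split("?")[0]

    gross_unique = len(first_request)
    robots = sum(1 for path in first_request.values() if path == "/robots.txt")
    print(f"gross unique visitors:             {gross_unique}")
    print(f"robots (landed on /robots.txt):    {robots}")
    print(f"net assumed-human unique visitors: {gross_unique - robots}")

Because only well-behaved robots bother to request robots.txt, this estimate will undercount robots, so treat the result as an optimistic upper bound on your human visitors.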

Producing a report of “net assumed-human unique visitors” is NOT the end of a full and complete analysis of a website, but it is a good starting point. I not only check this number daily; if I am sitting at my desk, I will likely check it hourly to see how my sites are doing that day. I will do a more detailed analysis of my sites weekly and a full in-depth analysis monthly, but I check the “net assumed-human unique visitors” number as a quick indicator and a “trip wire” to trigger a deeper look if the results go very high or very low.

More reporting will provide more information about how your website is being used. Adding something like Google Analytics will produce even deeper insight.

A webmaster simply cannot have too much reporting and analysis.

In future blog postings, I will discuss analysis in more detail.

Category: (04-13) April 2013

