Evangelism by Search Engine – Part 4 - Off-Page SEO Factors
Posted by Bill Anderton
In Part 3 of this series, I wrote about the two different methods that most modern search engines use to rank search results: (1) on-page factors; and (2) off-page factors. I also covered the details of how on-page factors work. In today’s blog posting, I am going to write about the off-page factors used in ranking pages shown in search results and how they can greatly improve search results for end users as well as for the websites that apply basic SEO practices.
The simple fact of the matter is that the content of pages alone cannot reliably determine the relevance of the pages matching a search query. One of the big reasons is that all of the on-page factors are totally in the control of webmasters. It would be very easy for webmasters to manipulate the on-page factors in order to game the system for higher rankings. The temptations are great because there can be a lot of money from advertising or product sales at stake; some commercial websites depend on search engine referrals for something like 70% of their total traffic and almost 90% of their revenue.
Webmasters face a lot of temptation to learn all they can about how search engines rank pages and then use this knowledge to manipulate the various factors for more traffic. Some less-than-totally-scrupulous actors have been known to manipulate the factors even if it results in users being referred to pages that are totally irrelevant to their search queries. It is not in the interest of the search engines to refer a lot of irrelevant pages to their users; people would switch to another search engine if it happened too often.
Less-than-totally-scrupulous webmasters want the traffic, period. In other words, webmasters aren’t always the most unbiased of people. Putting all of the ranking within factors that they can easily manipulate would be just plain dangerous.
Yes, search engines depend on on-page factors to rank pages but they also use a lot of off-page factors in their calculations. The general feeling is that off-page factors can be more unbiased and are typically much more difficult to unfairly manipulate in order to game the system.
All search engines have ranking algorithms that integrate both on-page and off-page factors in a synergistic way using complex calculations to determine the ranking of each page in their indexes.
Each of these factors is called a “signal” and all search engines collect a variety of signals to put into their algorithms.
Some signals are harvested from search engines' own user data, which track how users interact with content on referred pages and how they behave when they see it. For example, if your pages are shown to people in SERPs (search engine results pages), what is the click-through rate? How many people actually click on your link in the SERPs and visit your pages? Once they click into your site, do they stay only a couple of seconds and leave? The share of visitors who do is called the “bounce rate.” If they stay, how long do they stay and how many pages do they click to view on your site?
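To make these two engagement signals concrete, here is a minimal sketch of the arithmetic behind them. The function names and example numbers are my own illustration; search engines do not publish how they actually compute or weight these metrics.

```python
# Hypothetical illustration of how engagement signals like click-through
# rate and bounce rate reduce to simple ratios. The field names and
# numbers are invented for illustration only.

def click_through_rate(impressions, clicks):
    """Share of SERP impressions that resulted in a click on your link."""
    return clicks / impressions if impressions else 0.0

def bounce_rate(visits, single_page_visits):
    """Share of visits that left after viewing only the landing page."""
    return single_page_visits / visits if visits else 0.0

# Example: your page is shown 1,000 times in SERPs, clicked 30 times,
# and 18 of those 30 visits leave after seeing only one page.
ctr = click_through_rate(1000, 30)   # 0.03 -> a 3% click-through rate
bounce = bounce_rate(30, 18)         # 0.6  -> a 60% bounce rate
```

A high bounce rate like the 60% above is exactly the kind of behavioral evidence that can tell a search engine a page did not deliver what its keywords promised.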
In other words, search engines are adaptive; they can adapt their rankings of pages based on the actual usage patterns of the end users who interact with those pages. A webmaster might claim relevance by using certain keywords, but if the page really isn't relevant or engaging, a high percentage of users will bounce out of the page and go elsewhere. Search engines can also reward sites with good content, made evident by how your visitors actually use that content.
By the way, this is why so much emphasis is placed on having good and engaging content.
The advantage of these techniques is that they aren’t subjective measures and don't rely on opinion. Instead, they use behavior, which does a pretty good and accurate job of assessing the subjective quality of content by simply observing how the people the search engines send to your site actually use the content they find there.
It is really a simple concept: sites with good content will have visitors who exhibit behaviors of actually engaging with and using the content. On sites with poor or thin content, visitors will “vote with their feet” and run away from the bad content pretty quickly. All the search engines have to do is track this behavior and harvest the patterns as signals, and they can algorithmically assess your content.
Think about the elegance of this approach; it lets the crowd using the content rate it by how they use it. There is nothing extra to do: no surveys and no focus groups. Every click gets tracked as it happens, and the clicks record behavior. Also, these techniques integrate with and test (validate or refute) all other SEO techniques, and it is almost impossible to game the results.
Often, when I speak at church workshops and mention this method of collecting signals and how they are used in ranking pages, I notice the rapid onset of seat squirming. People become noticeably uncomfortable, and it is a wave that ripples through the room. Typically, when I poll the audience, I find that the same people who are squirming in their seats rate their own content on their websites as poor. One said, with a grin on his face, “That's not fair. You mean people are going to actually look at my stuff! And … I get evaluated by how they DON'T use my content. My content can’t hide behind the fig leaf of anonymity? I’ll never get ranked.” Yes, this pastor was grinning because he already knew that he had to do a major upgrade of his website and was in the workshop for that purpose.
However, many haven’t come to this catharsis yet; they don’t yet know the full impact of having very thin and/or very poor content.
Many more signals than just the ones I cite above are collected and used. It is widely reported that Google collects at least 200 signals in its evaluation of content. It is equally widely reported that the real number is very likely 5x to 20x more than that! An employee of Bing has stated that they use 1,000 signals, and this too is considered a low estimate.
Regardless of what is reported, it is known that the search engines regularly tweak the signals they collect and how their algorithms process them. Google has a group that meets at least weekly to make these adjustments.
Go back to the diagram at the beginning of Part 3 of this series (repeated below). The search engines employ a lot of very smart people to continuously improve their technology to deliver on the concept shown in the diagram.
As an industry, we don’t actually know many details about the search engines’ signals and algorithms; they are super-secret and proprietary, well beyond being considered trade secrets; think nuclear-launch-codes secret. Why? Because the marketplace will reward the search engine with the best, most-accurate ranking algorithm with lots of money! Google’s annual revenue has topped $50 billion, and the company has a market-cap value of $285 billion as of today. It is a very competitive marketplace.
Also, competition and money aside, it is not in the interest of the search engines to share too many details of their algorithms, lest webmasters learn how to unfairly manipulate the various factors. Secrecy makes it difficult to game the system.
The search engines do publish tips and recommendations of SEO best practices. These techniques are called “white hat SEO.” There are also certain unapproved techniques that may (or may not) work and are used intentionally to try to game the system for an inappropriate advantage. These techniques are called “black hat SEO.”
There are lots of professionals like me who study the search engines closely and scour open sources for any hints about how the search engines’ algorithms work. While we’re basically only reading the tea leaves, we do know, more or less, how the search engines’ algorithms generally work, but we don’t know many of the details; we know some things fairly certainly, some things are probably true and some things are a complete mystery.
This is why there are so many “experts” in the SEO business. We all are dealing with a purposely-opaque process (because of the designed-in secrecy of the search engines’ signals and algorithms) that keeps a lot of things a mystery.
There is an adage in this business that says, “Where there is mystery, there is margin!”
Naturally, the marketplace has responded to fill the void created by this mystery. There are many people who claim to be experts with special insights into the rubric for increasing the ranking of web pages. Some are true professionals and are very good; they deliver dramatically increased traffic from search engines and make lots of money selling their expertise. Others, unfortunately, are selling black-hat SEO techniques that are not approved by the search engines, might actually involve deception and can potentially harm the rankings of the pages that use them. If you check your spam filters, you will likely find unsolicited commercial e-mail promotions from many of these so-called experts who claim they can work wonders with your rankings. Be careful of these sources.
As when engaging any professional, “let the buyer beware” and be sure to check references.
Typically, because of the expense involved, only the largest online ministries use professional SEO. The vast majority of churches must do SEO themselves. As I wrote in Part 3 of this series, churches of any size can easily learn to do the basics and benefit from improved visibility of their websites in the search engines. In Part 3, I wrote about on-page factors because poorly-written and un-optimized text on a page will very likely render your pages all but invisible in search engines. In Part 4 of this series below, I’ll write about off-page factors that can add even more visibility to your pages if you want to build upon the foundation provided by good content.
You should use both sets of techniques (on-page and off-page); all of these techniques work together. Some are weighted more than others but no single factor will guarantee top ranking. Employing several techniques increases your odds and the more of the techniques you use (along with others not listed here), the better your chances of getting top rankings.
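The idea that many weighted factors combine, with no single one guaranteeing a top ranking, can be sketched as a simple weighted sum. To be clear, the signal names and weights below are entirely made up for illustration; real engines combine hundreds of signals with far more complex, and secret, models.

```python
# Hypothetical sketch of combining several ranking signals into one score.
# The signal names and weights are invented for illustration only;
# no search engine publishes its actual signals or weights.

WEIGHTS = {
    "content_quality": 0.4,   # on-page: well-written, relevant text
    "inbound_links":   0.3,   # off-page: links ("votes") from other sites
    "engagement":      0.2,   # off-page: user behavior signals
    "freshness":       0.1,   # off-page: recently updated content
}

def page_score(signals):
    """Weighted sum of normalized (0..1) signal values; missing signals count as 0."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

# Great content but few links and modest engagement still earns a decent score;
# improving any one signal raises the total, but none dominates alone.
score = page_score({"content_quality": 0.9, "inbound_links": 0.2,
                    "engagement": 0.7, "freshness": 0.5})
```

Notice that in this toy model a page scoring zero on one signal can still rank if the others are strong, which mirrors the point above: employ several techniques and let them work together.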
Plan your SEO strategy around those things that you can easily do initially and then improve your techniques as you go. If you don’t have a plan to get your content found, it very likely WON’T be found. On the other hand, since most churches don’t do any SEO at all, those churches that do even simple SEO techniques will be rewarded with better rankings more quickly than they would be in more competitive online arenas with highly-competitive keywords.
Discussion of the use of specific off-page factors that can increase the ranking of your pages in the search engines must begin with building links to your pages from third-party sites. Links to your site from other web pages are big factors in improving your rankings because each link is, in essence, a “vote” or an “endorsement” for the linked page. Typically, people only link to quality content so links provide good low-noise signals for rating valuable content.
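The links-as-votes idea can be sketched with a toy version of the kind of link-counting calculation popularized by Google's original PageRank paper. This is a teaching simplification with invented site names, not any search engine's actual algorithm:

```python
# Toy "links as votes" ranking in the spirit of PageRank.
# A teaching simplification only; real link analysis is far more elaborate.

def rank_pages(links, iterations=50, damping=0.85):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}       # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # each page passes a share of its rank to the pages it links to
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A hypothetical church site endorsed by two community-service sites
# accumulates more "votes" than the sites nobody links to.
web = {
    "church":   [],
    "foodbank": ["church"],
    "shelter":  ["church"],
}
ranks = rank_pages(web)
```

Running this, the "church" page ends up with the highest rank precisely because two independent sites point to it; that is the "endorsement" effect the paragraph above describes.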
Search engines generally look for three things when evaluating the links to your pages:
While building links does require some effort on your part, your efforts will be nicely rewarded. Also, building links is not inherently difficult to do; churches have a large pool of link sources that you can draw upon:
Be prepared to ask others to link to you regularly; just once isn’t enough. Have a standing plan to do it often.
The best way to get links is to have content on the linked page that is worthy of the link. Churches that are deeply involved in mission and community service should have lots of opportunities for links. In other words, don’t just ask for something; provide something of value in return, like quality content on a worthy topic of general interest that others would be proud to link to.
By the way, churches, being churches, inherently have a lot of advantages over commercial companies when it comes to building links. However, few churches ever use these advantages, and most have few inbound links to their websites.
Yes, it is a little work; but not much! And, it has a big payoff for your rankings.
Other off-page factors beyond links include:
Reputation on social media – How respected are you on social media?
Shares – How many people share your content?
Authority – Do your links, shares and content cite trusted authorities?
History – How long has your site/domain been around?
Country – What country is your domain located in?
Locality – What city is your domain located in?
Loyalty – Do people regularly visit your site and return to it?
Also, search engines do consider some negative things in their analysis of your pages and penalize rankings based on these negative factors. Such things include violations, the use of certain “black hat” SEO techniques and people blocking your site from their search results.
As I wrote in Part 3, please note that I have NOT written about all of the tricks of the trade in this posting. These are just the most basic things that will yield the most “bang for the buck” (for the time it takes to do them). Although basic, the techniques discussed in this posting will dramatically improve your rankings in the search engines, and that should bring a whole group of new visitors to your website.
Category: (05-13) May 2013