
SEO Algorithms: An Introduction

If there is one thing every webmaster needs to learn, it's what a search engine algorithm is and how it affects their placing on the results pages. In this document you'll learn not just what the algorithm is, but also what this means in practical terms for the way that your site should be designed in order to achieve good search engine placement.

What is a Search Engine Algorithm?

For such a complicated-sounding concept, search engine algorithms are actually quite easy to understand. Basically, a search engine algorithm is the mathematical formula that a search engine uses to rank the pages in its index. As an example, the algorithm may look something like this:

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

What you're looking at here is the original PageRank calculation that can be found in Sergey Brin and Lawrence Page's proposal for Google on the Stanford University website. Now this is NOT the current Google algorithm, although the current algorithm does also contain the PageRank calculation. As you can imagine, the precise algorithms of all major search engines are closely guarded secrets because if webmasters knew what the algorithm really was, it would be open to manipulation; what those in the industry refer to as 'gaming.'
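To make the formula concrete, here is a minimal sketch in Python of how the PageRank calculation above can be computed iteratively. The four-page link graph and the damping factor d = 0.85 are invented purely for illustration; this is the published 1998 formula, not Google's current algorithm:

```python
# Iterative sketch of the original PageRank formula:
#   PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
# where the Ti are pages linking to A and C(T) is T's outbound link count.
# The link graph below is invented for illustration.

links = {            # page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
d = 0.85             # damping factor from the original paper

pr = {page: 1.0 for page in links}           # initial guess
for _ in range(50):                           # iterate until values settle
    new_pr = {}
    for page in links:
        inbound = [p for p, outs in links.items() if page in outs]
        new_pr[page] = (1 - d) + d * sum(pr[p] / len(links[p]) for p in inbound)
    pr = new_pr

for page, score in sorted(pr.items(), key=lambda x: -x[1]):
    print(page, round(score, 3))
```

Notice that page C, which attracts the most inbound links, ends up with the highest score, while page D, which nobody links to, gets only the baseline (1 - d) = 0.15.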

Of course, it's not the idea of the algorithm itself that is complicated, it's what the algorithm consists of. In order to understand today's algorithms, you need to understand how they have evolved from their original form.

Once upon a time, algorithms were very simple. They were designed so that the search engine would simply match the search terms (sometimes referred to as the 'query') against the content and meta tags of each webpage. As you can imagine, and as you can probably remember if you were using the internet in the 1990s, this meant that search engine rankings were open to manipulation. Early webmasters would simply enter as many high-ranking search terms as they wanted into their pages. These terms didn't even have to be related to each other, so you would have webpages whose meta keyword tags looked something like this:

<META name="keywords" content="Brad Pitt pictures, buy Mexican medicine here, motor homes for sale, how to get pregnant fast, make money online, work from home, Canadian pharmacy, free downloads, start your own business, Angelina Jolie naked pictures, cheap flights, poodle grooming, hairstyle design">

As you can imagine, this situation quickly had to change or the internet would have become swamped with completely meaningless websites. What happened was that the search engines updated their algorithms to ignore keyword meta tags altogether.

Of course, as fast as the search engines were updating their algorithms, certain unscrupulous webmasters were working to find ways around them. These unscrupulous methods are now known as 'black hat' SEO methods and are still very much in practice today, perhaps more so than ever before. In order to keep up, the major search engines change their algorithms constantly. According to the experts over at SEOmoz.org, Google changes its algorithm around 500-600 times a year, rolling out a major change every few months.

As we speak, Google's Panda algorithm is onto version 3.4. Since it was launched in early 2011, Panda has been successful in separating low-quality websites from truly informative and well-designed ones. A further update in February this year resulted in a number of high-profile casualties, including ezinearticles.com, hubpages.com, business.com, suite101.com and many other supposedly well-established and high-ranking article directories. Like many other updates to the algorithm, this latest significant change means that current SEO strategy, which has depended heavily on article linking for some time now, needs a rethink.

The situation was described well by SEO expert Amanda MacArthur:

'Hi, and welcome to the Internet, where the Google Gods will continue to take away as many "cheat" sites as they can. If you (and every other marketer) spends hundreds of hours on a site that is temporarily giving you a boost in traffic, you better believe that Google will find it and dismember it.'

But if every SEO strategy is going to be discovered by Google and the other major search engines, what is the point in trying? Well, the point is that Google is not the enemy. On the Google Webmaster Central Blog, Google Engineer Matt Cutts has explained that Google is not against SEO strategies per se:

'Google has said before that search engine optimisation, or SEO, can be positive and constructive – and we're not the only ones… The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfil their information needs. We also want the "good guy" making great sites for users, not just algorithms, to see their effort rewarded.'

The key, then, is not to try to 'trick' the algorithm, but to understand the type of sites it rewards and the type that it penalises, and build your site accordingly.

Natural Optimisation

Natural optimisation is the process of trying to gain greater visibility for your site in the search engine results pages through 'white hat' or 'green' methods, i.e. by trying to aid the search engine crawlers in their indexing process, and by creating sites that are unlikely to be penalised.

'White hat' SEO is all about competing with other sites in your niche, while 'black hat' SEO is all about tricking the algorithm. Over the long term, white hat methods are far more viable than the black hat methods that the search engines seek to weed out with every algorithm update.

Here are some of the most common natural optimisation methods:

  • Keyword placement and emphasis
  • Hierarchical site design and usability
  • Internal links and backlinks
  • Meta title and description
  • XML sitemaps and robots.txt files

You may notice that both keywords and backlinks are on this list of 'white hat' methods despite having quite a bad reputation. It's true that both of these methods have also been used in 'black hat' SEO techniques. When I talk about using keywords, I don't mean keyword-stuffing, I mean using them when it comes naturally. When I talk about backlinks, I mean legitimate backlinks from real sources.
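As a concrete illustration of the last item on the list above, here is what a minimal robots.txt file might look like; the domain, the disallowed path and the sitemap location are placeholders for illustration, not recommendations for any particular site:

```
# robots.txt, served from https://www.example.com/robots.txt
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line points crawlers at an XML sitemap listing the pages you want indexed, which aids the crawlers in exactly the way described above.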

Does Natural Optimisation Really Work?

Of course, there may be a few of you out there who don't believe that creating a well-made, high-quality site with a clear sitemap is enough to gain first-page results. Surely a site that uses a keyword fifty times will gain a better ranking than a site that only uses the same word naturally ten times? Surely a site that has spammed every article directory out there to gain backlinks will have a better ranking than a site that patiently waits for its helpful neighbours to link to it genuinely?

Well, the important thing to remember is the ever-evolving and really quite clever algorithm. The algorithm can now tell if a keyword is being overused, and if this affects the readability and grammar of the content, the site will be penalised. Some links are considered to be of better quality than others, and will make more of a difference to your ranking. Links that come from disreputable sites (and the now heavily penalised article directories) won't count for as much.
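The search engines don't publish how they detect over-use, but the basic idea can be sketched with a naive keyword density check in Python. The 5% threshold and the sample text here are invented for illustration, not a real ranking rule:

```python
import re

def keyword_density(text, keyword):
    """Naive density: occurrences of the keyword divided by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) if words else 0.0

# A deliberately keyword-stuffed sample page.
page = ("Cheap flights to cheap destinations: book cheap flights "
        "with our cheap flights service for cheap cheap prices.")

density = keyword_density(page, "cheap")
# Flag anything over an arbitrary 5% threshold as likely over-use.
if density > 0.05:
    print(f"Page looks keyword-stuffed: 'cheap' density is {density:.0%}")
```

A real algorithm would of course weigh many signals together, but even this toy check easily flags the kind of stuffing shown in the meta tag example earlier.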

Precisely because the algorithm is secret and always evolving, the only way to gain a high ranking and maintain it is to do so with natural methods.

F.A.Q.s

  • Does the algorithm penalise excess keyword density?
    Keyword density itself is not thought to be a ranking factor at the moment. While keywords should never be used excessively, since overuse produces a nonsensical page that reads badly, they are still the essence of all search engine optimisation.

  • Do all search engines have different algorithms?
    Yes, although of course we don't know what they are. It's assumed that all search engines learn from each other and follow generally the same pattern, in order to compete with each other. For example, if Google is able to weed out all nonsensical ad-stuffed webpages, AskJeeves had better be able to do the same thing or people will no longer go there to search for information.

  • How can you compete with other websites without using underhand methods?
    The only way to compete is to provide better, more useful information, and to provide more of it. Content is incredibly important at the moment, so the more high-quality content you have, the better. Have a better, clearer site design, and make sure you are utilising every natural optimisation method open to you.

  • My site has been penalised by the latest Panda update, what do I do?
    If you go to the Google Webmaster Central Blog, there's a great explanation of what the Google algorithm is looking for and the sites that it's looking to weed out. Look closely at the content of your site and ask yourself the following questions: is it grammatically correct and well-referenced? Are there any spelling errors? Has the information been written by an expert or just a random person? Does it really say enough about the subject? Do you think it tells your readers what they wanted to know?

  • How do you get backlinks without using link farms?
    There are plenty of ways to get good quality backlinks including guest blogging, commenting on blogs, creating YouTube videos and a social media presence, which are all discussed further in the white paper 'Natural Optimisation – Off-Site Techniques.'

Conclusion

You should now have a good understanding of exactly what a search engine algorithm is, how it affects search engine results page rankings, and how major search engines must change their algorithms constantly in order to avoid manipulation by 'black hat' SEO experts. Hopefully you should now be beginning to grasp why it's a far better idea to use natural optimisation techniques, and how you can use these to your advantage. For more information on how to naturally optimise your site, please see the white papers 'Natural Optimisation – On-Site Techniques,' and 'Natural Optimisation – Off-Site Techniques.'

Search engine optimisation can be a time-consuming and complicated process, which is why more and more businesses are turning to SEO specialists to help them get ahead of the competition. Zanity provides a full range of on-site and off-site search engine optimisation services that will have you at the top of the organic listings in no time. Contact Zanity today for a health check up on your current web presence and find out what we can do for you.
