
Natural Optimisation – On-Site Techniques

As discussed in the white paper 'SEO Algorithms – An Introduction,' natural optimisation is the best way to avoid penalisation by search engines, and to ensure your website's long term presence in the top results pages. In this paper, you'll learn exactly what natural optimisation on-site techniques are and how you can use them to your advantage.

What is Natural Optimisation?

In order to understand natural optimisation, you first need to understand natural listings. Natural listings are the results that appear organically on the SERP (industry jargon for 'search engine results page') because the search engine's algorithm has scored them highly on relevance; they do not include the advertisements or paid listings that may also appear on that page. Natural listings are incredibly important simply because they're free: if you can rank highly using natural techniques, you may not need a pay-per-click campaign or a paid listing.

When we talk about natural optimisation, we are talking about the process of getting pages to rank highly not by attempting to manipulate results, per se, but simply by creating pages that fulfil the algorithm's requirements. Natural optimisation is also known as 'white hat' SEO, as opposed to 'black hat' SEO. White hat SEO is the use of open and honest techniques such as keyword relevance and good site architecture, whereas black hat SEO uses various tricks that attempt to fool the algorithm. While black hat techniques can work in the short term, all the major search engines are continually working to thwart them, so they usually become ineffective very quickly and rarely make practical sense, especially if you are running a legitimate business that needs a stable web presence.

On-Site Techniques for Natural Optimisation

When it comes to on-site natural optimisation techniques, we can roughly divide them into two equally important categories: the site's content, and its architecture or design. I'll talk about architecture first, as it's important to incorporate SEO techniques into the very design of your site; the most search-engine-friendly sites have had SEO in mind from the very start.


First things first, then: the hierarchy of your site. You can think of this in terms of your site map, your linking structure, and the categories and sub-categories your site is divided into. Basically, when I talk about hierarchy I mean the way your site is organised. You may not have realised that this is incredibly important for SEO purposes; many people don't.

Before I explain exactly how best to organise your site, I'll explain why it's so important. When a search engine crawls your site, indexing is much easier if the site's structure can be readily identified. If, for example, a crawler lands on one of your pages that is linked from only one other page and contains no link trail or site map, that crawler won't easily be able to get a picture of your whole site.

What the crawler is looking for is a well-organised and user-friendly site, and this should be immediately apparent on EVERY page. Every page should contain a link trail leading back to the homepage, no matter how many links have been followed to reach that page, as well as links to the homepage and main categories. This returns a very clear picture of your site's organisation to the search engine, which will benefit your ranking.

Linking Structure

Another aspect that's important to understand in natural search engine optimisation is how your internal linking structure can affect both your PageRank and where you end up on the SERP. We all understand that links are important, but links between your own pages count as links too, so it's worth linking your pages to each other quite apart from clear hierarchical purposes.

PageRank works by an algorithm first described by Lawrence Page and Sergey Brin in the original Google paper 'The Anatomy of a Large-Scale Hypertextual Search Engine,' which can still be found on the Stanford website:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

where d is a damping factor (set to 0.85 in the original paper), T1…Tn are the pages that link to page A, and C(T) is the number of outbound links on page T.

Basically, this algorithm calculates the number of 'votes' each page has received from other pages on the web. If a page is itself highly ranked, it confers more of a 'vote' on the pages it links to. This is why you may have heard people talk about the importance of getting backlinks from 'authority' websites: they can pass a large amount of their PageRank power on to you.
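To make the formula concrete, here is a single application of it in Python. The input numbers are entirely hypothetical; only the damping factor d = 0.85 comes from the original paper.

```python
# One application of PR(A) = (1-d) + d(PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
# with made-up inputs: page A is linked from two pages, T1 and T2.
d = 0.85                 # damping factor, 0.85 in the original paper
pr_t1, c_t1 = 4.0, 10    # T1: high PageRank, but 10 outbound links
pr_t2, c_t2 = 2.0, 4     # T2: lower PageRank, only 4 outbound links

pr_a = (1 - d) + d * (pr_t1 / c_t1 + pr_t2 / c_t2)
# T2 actually passes the stronger vote (2.0/4 = 0.5 vs 4.0/10 = 0.4),
# because its 'vote' is split between fewer outbound links.
```

Notice that the lower-ranked page passes the stronger vote here, because its rank is divided among fewer links.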

You are also able to control your PageRank in the way that you link internally. Now, if you were to link in a circular structure, like this:

[Figure 1: a circular linking structure in which each page links to the next, forming a loop]

Each page would receive exactly the same PageRank, because each conveys the same share of its 'vote' to the next. This is perfect if you'd like every page to have the same rank, but if you remember what I said about hierarchy, you'll know that this wouldn't create the user-friendly, easily identifiable structure that the search engines are looking for.

The structure of a nicely organised and clearly hierarchical site would probably look something more like this:

[Figure 2: a hierarchical linking structure in which every page links back to the homepage]

As you can see, there are far more links pointing to the homepage than to any other page, which means your homepage would end up with a far greater PageRank than any other page on your site. What's the problem with that, you may ask? Well, by giving all your PageRank to your homepage you're putting all your eggs in one basket, so to speak.
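The contrast between the two structures can be sketched by iterating the PageRank formula over two hypothetical three-page sites. The page names and d = 0.85 are illustrative only:

```python
def pagerank(links, d=0.85, iterations=100):
    """Iterate the PageRank formula; links maps page -> pages it links to."""
    pr = {p: 1.0 for p in links}
    for _ in range(iterations):
        pr = {p: (1 - d) + d * sum(pr[t] / len(links[t])
                                   for t in links if p in links[t])
              for p in links}
    return pr

# Circular structure: A -> B -> C -> A
circular = pagerank({"A": ["B"], "B": ["C"], "C": ["A"]})

# Hierarchical structure: every subpage links back to the homepage
hierarchical = pagerank({"home": ["cat1", "cat2"],
                         "cat1": ["home"], "cat2": ["home"]})
```

In the circular site every page converges to the same rank, while in the hierarchical site the homepage accumulates the lion's share of the vote.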

However, many experts would agree that a clear hierarchical structure is more important than PageRank, which is only a small part of the algorithm used to determine actual search results relevance. Practically speaking, and especially if you have a large site, you have no choice but to convey most of your PageRank onto your homepage.

However, you can always boost the PageRank of your whole site by obtaining backlinks from other high ranking sites. You can find out more about how to do this in the white paper 'Natural Optimisation – Off-Site Techniques.'

Linking Elsewhere

Now, considering what I've just said, you may be wondering why you've heard it said that you should link to other related websites for SEO purposes. Well, SEO is a complicated subject and attracts much contradictory advice. By linking to other sites you give away a certain amount of the PageRank vote that could have been used to strengthen your own pages internally. However, as I've said before, today's search engines want to see that your website is an authority on its subject, and relevant outbound links help to convey that impression, especially when they are wrapped in anchor text that makes the subject of the link clear.

Basically, it's a matter of balance. While you want to maintain a good level of PageRank for yourself, you also want to create the impression of an authority with relevant links. Remember that PageRank is only one aspect of the actual search results presented, and many white hat SEOs would maintain that PageRank is no longer important at all (while many others would say it's just as important as it always has been). Without knowing the actual algorithm itself, there is no way of knowing exactly how important PageRank currently is to Google results. However, in the absence of this knowledge, it's best to optimise in every way you can.

Overall, link to other authority websites, but not excessively, and try to create a strong and balanced internal linking structure.

XML Sitemaps

Including an XML sitemap among your site's files is crucial to good SEO because it makes indexing far easier. Along with a clear hierarchical structure, it lets search engine crawlers see exactly which pages on your website are available to crawl, so that no page ends up being left out. It also tells the crawler about updated content via its frequency settings, e.g. if you update your blog weekly, this informs the crawler that new content should be crawled every week. The XML file should always be placed in the main public (root) folder of your site so that it is easy to find.
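As a sketch, a minimal sitemap file can be generated with Python's standard library. The URLs and the changefreq values below are hypothetical examples:

```python
import xml.etree.ElementTree as ET

# Build a two-entry sitemap; in practice you'd list every public page.
urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, freq in [("https://www.example.com/", "monthly"),
                  ("https://www.example.com/blog/", "weekly")]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = freq  # hints at update frequency

sitemap_xml = ET.tostring(urlset, encoding="unicode")
# Save the result as sitemap.xml in the site's web root, e.g. /sitemap.xml
```

The weekly changefreq on the blog entry is exactly the sort of hint described above: it tells crawlers that new content appears there every week.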


You should now have quite a good understanding of how a site should be designed for natural optimisation. But what you actually place on those pages is just as crucial as how you arrange them. The next section explains the theory behind content optimisation.

Keywords & Content

Now, you may have heard the warning term 'keyword stuffing' floating around the Web. This has perhaps led you to the mistaken belief that keywords are no longer important or are ignored by search engines. This is most definitely NOT the case; keywords are just as important as they ever were, if not more so. It used to be the meta keywords tag in the header of your site's code that was crucial, but that tag was abused by unscrupulous webmasters and has now become almost irrelevant, except as a signal of which keywords the rest of your page is optimised for. What matters is placing your keywords in the actual body of your text, as many times as feels natural. 'Natural' is the key word to remember, because since Google's Panda update the algorithm has become much better at recognising which content makes sense and which is basically keyword-stuffed gibberish.

Keywords have always been the one crucial aspect of your page that tells the algorithm what your site contains and therefore its relevance to the search query. Each page should be optimised for perhaps one or two phrases, and should contain variations of these phrases.
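One way to keep keyword use 'natural' is to measure it. The sketch below counts how often a target phrase occurs relative to the total word count; the function name and sample copy are my own illustrations, and any density target you compare the result against is a rule of thumb, not a published Google threshold.

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    hits = sum(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))
    return 100.0 * hits * len(target) / len(words)

copy = ("Natural optimisation means writing for readers first. "
        "Natural optimisation techniques reward pages that read well.")
density = keyword_density(copy, "natural optimisation")
```

A density this high only looks plausible because the sample is two sentences long; over a full page of copy, the same phrase used twice would barely register.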

However, since the Google Panda algorithm update it is of the utmost importance that the content your site provides is useful, informative, well researched and referenced, and grammatically correct, ideally as well written as any magazine article. When introducing the Panda update, Google gave some guidelines on how to think about your content and decide whether or not it is quality. On the Google Webmaster Blog, Google employee Amit Singhal wrote that webmasters should ask themselves these questions:

  • 'Would you trust the information presented in this article?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Was the article edited well, or does it appear sloppy or hastily produced?'

In a nutshell, what Google (and following on from them, the other major search engines) are looking for is very high-quality content in every sense of the word. They are attempting to weed out the nonsense-purveyors of the internet, and are on a mission to boost sites which give good quality information. The art of SEO is in balancing quality content with keyword density.

Meta Tags

Google has explicitly said that the keyword meta tag no longer means anything to them, but that's not to say it doesn't retain a small amount of relevance for other search engines. It is advisable to include it, if only for consistency and to remind yourself which keywords and phrases you're optimising for on each page.

The meta description tag is still important, however. This is the phrase that should appear beneath your page in the search engine results list, and is your chance to interest readers enough so that they follow through with a click. It's never a good idea to attempt to be too persuasive, however. Anything that looks remotely like spam, such as 'click here, everything you could ever want at cheap prices, click here, click here!' is likely to be penalised. Even if it does manage to get through the algorithm, it will probably be ignored by real readers.

Keywords should be used one or two times in your description, but you should also provide a genuine and informative description of the contents of your page which inspires readers to visit. You might want to employ a good copywriter to get the wording of this just right.
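These rules of thumb can be sketched as a simple pre-publish check. The ~155-character limit reflects roughly what search engines display before truncating, and the helper name and sample description are hypothetical:

```python
def check_description(description, keywords, max_len=155):
    """Return a list of problems; an empty list means it looks OK."""
    problems = []
    if len(description) > max_len:
        problems.append("too long: may be truncated in the results list")
    uses = sum(description.lower().count(k.lower()) for k in keywords)
    if uses == 0:
        problems.append("target keyword missing")
    elif uses > 2:
        problems.append("keyword used more than twice")
    return problems

desc = ("Plain-English guides to natural search engine optimisation, "
        "covering site structure, internal linking and content quality.")
issues = check_description(desc, ["search engine optimisation"])
```

A check like this catches the mechanical problems; it can't judge whether the description actually inspires a click, which is where the copywriter comes in.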

It's also important to include a meta robots tag and a corresponding robots.txt file. These tell the crawlers which pages to index and which to ignore; hints about page priority and update frequency are better left to your XML sitemap.
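As an illustration, here are a minimal robots.txt and a per-page meta robots tag, built as Python strings. The blocked paths are hypothetical examples:

```python
# A hypothetical robots.txt for the site's web root. It blocks crawlers
# from private areas and points them at the sitemap.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search/
Sitemap: https://www.example.com/sitemap.xml
"""

# Per-page control goes in the HTML <head> instead, e.g. to keep a
# thank-you page out of the index entirely:
meta_robots = '<meta name="robots" content="noindex, nofollow">'
```

The robots.txt operates site-wide at the folder level, while the meta tag lets you exclude a single page without touching the shared file.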


You should hopefully now have a good grasp of exactly what on-site natural optimisation techniques are, but of course on-site techniques are only half the story. For more information, see the white paper 'Natural Optimisation – Off-Site Techniques.'

Search engine optimisation can be a time-consuming and complicated process, which is why more and more businesses are turning to SEO specialists to help them get ahead of the competition. Zanity provide a full range of on-site and off-site search engine optimisation services that will have you at the top of the organic listings in no time. Contact Zanity today for a health check of your current web presence and find out what we can do for you.
