Published on the Sunday Times (TechSunday) 21 September 2014
The higher a website ranks for relevant keywords on a search engine, the more traffic it receives, and this in turn means higher profits or greater exposure for the company or organisation that owns it. Twenty years ago, in 1994, Yahoo started operating, and once it became evident that most internet users were beginning their surfing sessions from search engines, website owners did everything they could to rank high in the search results.
At the time it was quite straightforward to rank high: webmasters would simply stuff their webpages with keywords describing the service or product they provided, wait a couple of days and, voila, they would be on top of Yahoo or Lycos, at least for a few days until their competition caught up with them again.
Around 1998 Google appeared on the scene with a proprietary algorithm called PageRank™. Websites were now ranked by how many incoming links they had: each website was awarded a value, and the higher this value (its PageRank), the higher it would appear in the search engine results pages. Soon all other search engines adopted similar technology, once it became evident that such a system produced better search results. The introduction of PageRank also gave birth to search engine optimisation (SEO), an entire multi-million industry revolving around the business of trying to rank websites at the top of the search results.
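For the curious, the core idea behind PageRank can be sketched in a few lines of code. This is a simplified illustration, not Google's actual implementation: the three site names and their link graph are invented, and real-world PageRank involves many refinements.

```python
# Toy PageRank: each site's score is built from the scores of the
# sites linking to it, split across their outgoing links.
links = {
    "a.com": ["b.com", "c.com"],  # a.com links out to b.com and c.com
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
damping = 0.85  # standard damping factor from the original PageRank paper
ranks = {site: 1.0 / len(links) for site in links}  # start equal

for _ in range(50):  # power iteration until the values settle
    new = {site: (1 - damping) / len(links) for site in links}
    for site, outgoing in links.items():
        share = damping * ranks[site] / len(outgoing)
        for target in outgoing:
            new[target] += share
    ranks = new

# c.com receives the most incoming link value, so it ranks first.
print(sorted(ranks, key=ranks.get, reverse=True))
```

Note how a site's rank depends not just on how many links point to it, but on the rank of the sites doing the linking — which is exactly why link farms and link buying became attractive shortcuts.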
Since 1998, virtual link farms, reciprocal links, the sale of links and many other techniques have been used to circumvent the PageRank system and push one's website as high as possible. It became a cat and mouse game between the search engines and the experts trying to rank the sites: Google would introduce new algorithms to sniff out link spamming, and the search engine optimisation companies would in turn find new ways of ranking while doing as little work as possible and maximising their profits.
Today, Google search and Bing are more sophisticated and can intelligently identify link spamming and blacklist websites which use such deceitful tactics. A website owner today might find it difficult to find and hire an ethical search engine optimisation company, and one wonders what criteria to use when selecting the right company and avoiding an unethical one, which might, after all, damage a website and its reputation irrevocably.
When outsourcing SEO work one has to avoid certain traps, the most common being the creation of shadow domains that divert traffic to other sites. This is done in order to blackmail a website owner into working with such a company forever, since terminating the contract would lead the SEO company to redirect the traffic to a new client, and the website owner would have forked out money to finance someone else's success.
Another unethical practice involves the creation of bridge pages combined with cloaking, a technique whereby search engines are shown one page stuffed with keywords whilst human visitors are shown the real page. This technique produces results fast, but once search engines catch up with such websites the resulting blacklisting will lead to an irreversible loss of traffic and might lead to a catastrophic loss of business.
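To see why search engines consider cloaking deceitful, here is a minimal sketch of the trick itself (shown to explain it, not to endorse it). The bot name strings are real crawler User-Agent substrings; the page contents are invented.

```python
# A cloaking site inspects the visitor's User-Agent header and serves
# a keyword-stuffed page only when a search-engine crawler comes calling.
CRAWLER_SIGNATURES = ("Googlebot", "bingbot")

def serve_page(user_agent: str) -> str:
    if any(bot in user_agent for bot in CRAWLER_SIGNATURES):
        # Crawler bait: never seen by a human visitor.
        return "<html>cheap flights cheap flights cheap flights</html>"
    # The real page, shown to everyone else.
    return "<html>Welcome to our travel agency</html>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```

Search engines defeat this by occasionally crawling with ordinary browser User-Agents and from undisclosed addresses, then comparing what they are served — any mismatch flags the site for blacklisting.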
Webmasters also have to be wary of companies that promise number 1 rankings on Google, even for the most competitive keywords, in a short time. No one can guarantee such rankings and, as with anything else in life, if the promises made sound too good to be true, then the SEO company is probably phony and will under-deliver.
Google’s guidelines have long focused on creating superior quality content aimed at human consumption, not pages stuffed with keywords to impress the search engines. SEO is a process that takes time and should be considered a marathon, not a sprint: when SEO is done correctly, results are long lasting and targeted traffic will keep reaching a website for years.
A list of guiding principles released recently by Google’s anti-spam team shows that search engines are today moving towards analysing contextual meaning rather than just extracting particular keywords from a webpage. This means that search engines are more likely to rank websites which offer particular value to their visitors rather than websites which act as a brochure, simply providing basic information about a service or organisation.
Today search engines try to recognise the main subject of a particular webpage or blog post, so content written with SEO in mind should be focused. If a website tackles various subjects, it is better to dedicate a separate page to each topic and avoid mixing unrelated keywords on the same page. Long gone are the days when SEO targeted single-word keywords: search engine users today type full sentences and questions in natural language and expect clear answers in return. A good SEO strategy should target such ‘long tail keywords’ to ensure that natural language queries lead visitors to the website.
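One practical way to discover long-tail keywords is to mine multi-word phrases from a log of real search queries. The sketch below is a hypothetical illustration — the query strings are invented examples — but the counting technique is standard.

```python
# Count recurring three-word phrases across natural-language queries;
# frequent phrases suggest long-tail topics a focused page could answer.
from collections import Counter

queries = [
    "how to repair a leaking kitchen tap",
    "best way to repair a leaking kitchen tap",
    "leaking kitchen tap repair cost",
]

def ngrams(text: str, n: int) -> list[str]:
    words = text.split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

counts = Counter(phrase for q in queries for phrase in ngrams(q, 3))
print(counts.most_common(3))
```

A phrase like “leaking kitchen tap”, appearing across several queries, signals a question real users are asking — exactly the kind of focused topic a single page should answer directly.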
Search engine optimisation rules keep changing and one has to keep abreast of developments. Most SEO experts today watch Google in particular, mainly because this search engine processes around 85% of all search traffic, so it makes business sense to focus on it. However, it is important to follow the tactics mentioned above, commonly known as ‘white-hat’ techniques, and pursue long-term success rather than chase some loophole which might produce quick results but will see all your success crumble in a matter of weeks once Google catches up with the tricks used by ‘black-hat’ SEO companies.
Copyright notice: This article was written by Ian Vella and published on the Sunday Times of Malta. Copyright may be shared between the mentioned author and entities. Please do not republish without permission.