Looking at the basics of Search Engine Optimisation, we have noticed that a lot of people who know the basic jargon of the industry consider themselves SEO experts. While what they know may be a part of SEO, it is rarely sufficient to fully understand the process and take a website to the top of Google's search results. For example, when discussing the various ways of helping Google's spiders navigate a site, it is easy to point to the robots.txt file. This file provides domain-level guidance to web spiders, telling them which pages to visit, and its content is very important. But there are far subtler mechanisms at work. We at brandvois know that this simplistic approach is not the best way to go: Google's algorithm has been updated so that the content of this file alone does not determine rankings. Using meta directives, which provide a page-by-page set of instructions for the spider, is a more effective and far more reliable way of reaching better rankings.
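To illustrate the difference: robots.txt gives crawlers domain-level instructions, while a meta robots directive controls an individual page. The paths and values below are illustrative examples only, not recommendations for any particular site.

```text
# robots.txt — domain-level guidance, placed at the site root
User-agent: *
Disallow: /private/
```

```html
<!-- Meta directive in a page's <head> — per-page instructions -->
<meta name="robots" content="noindex, nofollow">
```

The robots.txt rule asks all crawlers to skip a whole directory; the meta tag asks them not to index this one page or follow its links.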

The Meta Directive Approach

In the earlier days of the Google search engine, a technique called cloaking was used to fool the search engine and deliver false content to the person searching. A server was set up so that a page appeared one way to a human user and entirely differently to the Googlebot. Google's reputation took a severe beating whenever a cloaked site was delivered to a searcher, so Google fought back by blacklisting these sites, updating its algorithm and, in certain cases, adding human review to provide better results. Cloaking is still attempted by people around the world hoping to reach higher rankings. At brandvois, we know what must be done and what must be avoided at all costs; in search engine optimisation, the list of what not to do is just as important as the list of what to do. To avoid such penalties, there are numerous other ways we would suggest to reach better rankings in Google's search results.

The Cloaked Mystery of SEO

Last week was eventful, as we got quite a few questions from our clients about the cloaking post. Some of them were launching new websites for European customers, and their webmasters' suggestion of using cloaking to direct new customers to the corresponding sites set alarm bells ringing. To get clarity, it is important to understand the fine line that keeps cloaking within Google's rules. Cloaking can be used to redirect users to a version of the site in their own language; that is within the limits Google sets. If your website has to serve clients across different countries, it is acceptable to use this kind of redirection. By fetching a page as the Googlebot user agent through Webmaster Tools and comparing it with what a human visitor sees, the line between legitimate use and blacklisting can be drawn more clearly. The same concept applies to versions of a site created for mobile devices: these are considerably different from the normal pages, and this form of redirection is the simplest way to keep one domain name while serving different clients all over the world.
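A minimal sketch of the legitimate use case described above: choosing a site language from the browser's Accept-Language header rather than deceiving the crawler. The function name and supported-language set are hypothetical, and real-world parsing should follow the HTTP specification more closely.

```python
# Pick the best-supported site language from an Accept-Language header.
# Simplified sketch: assumes well-formed "lang;q=value" entries.

def pick_language(accept_language, supported, default="en"):
    """Return the supported language with the highest client preference."""
    best_lang, best_q = default, 0.0
    for part in accept_language.split(","):
        pieces = part.strip().split(";")
        lang = pieces[0].split("-")[0].lower()  # "fr-FR" -> "fr"
        q = 1.0  # quality defaults to 1.0 when no q= is given
        for p in pieces[1:]:
            if p.strip().startswith("q="):
                try:
                    q = float(p.strip()[2:])
                except ValueError:
                    q = 0.0
        if lang in supported and q > best_q:
            best_lang, best_q = lang, q
    return best_lang

# A French browser visiting a site with English, French and German versions
print(pick_language("fr-FR,fr;q=0.9,en;q=0.8", {"en", "fr", "de"}))  # fr
```

The server would then redirect to the matching language version, while the page content itself stays the same for humans and crawlers alike.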

Multilingual sites and cloaking

Recently there has been more awareness about SEO, and since we have been at it for more than a few years, a lot of enquiries have started coming through. Some of our new clients' websites have been in existence for over ten years and yet have not reached the relevant heights in ranking. Taking a look at these sites, we were surprised to see the absence of the keywords and general descriptions that are absolutely required to be noticed by Google's search bots. It was tricky to place the keywords in the website, as it might be flagged as spam. There was a lot of discussion about how to sort out the issue, and good old business logic came to our rescue. After a detailed analysis of each website, we came up with a unique strategy: a step-by-step procedure, created by our team, for websites that were not built with SEO on the agenda. Even though it was a prototype process, years in this domain have given us the edge to adapt as required. The results have started coming in on these websites, and we are really happy that we were able to push boundaries.
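The missing on-page elements described above typically live in the page's head section. The text below is a hypothetical example, not any client's actual markup, and shows the sort of description and keyword tags an older site might lack.

```html
<head>
  <title>Handmade Leather Bags | Example Store</title>
  <!-- A concise description search engines can show as the snippet -->
  <meta name="description" content="Handmade leather bags crafted in small batches. Free shipping worldwide.">
  <!-- Keywords carry little ranking weight today, but their complete absence is still a gap worth closing -->
  <meta name="keywords" content="leather bags, handmade, satchels">
</head>
```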

Built for Optimisation

One of the many steps involved in getting a page ranked higher is getting the keywords right and developing website content that is relevant to the keywords being targeted. Even though many companies and web developers give prime importance to these factors, too many smaller details are ignored. It is these details that make a website strong in terms of SEO and help it remain stable in a high position for long periods of time. The Google search engine and its spiders are fascinating in the way they work, and when we look at how they move through pages, it is easy to notice similarities to human reading habits. Humans move through content from left to right and give more importance to content at the top; Google's web spiders go through an HTML page in much the same way. We have noticed over the years that the layout of a page has a role to play in how it is ranked. While it may not be a deciding factor, it is small details like these that keep a website high in Google's rankings.

The Layout factor for SEO

Just as Google revolutionised the way the internet was used, Facebook has done something similar. By accumulating hundreds of millions of people on one platform, the people at Facebook have created an area teeming with activity, where one can reap rich rewards by working smart and fast. We have always kept an eye on the growth of social media, and the explosion of Facebook has opened some new angles of approach to SEO. When we first looked at social media, starting from Orkut and many similar sites, there were not many options available for webmasters to integrate them into an SEO strategy. But that has changed, and companies have now come up with applications that help use social media platforms for this purpose. For example, Facebook offers a tool called Facebook Insights, and at brandvois we always adopt new technologies – anything and everything that can give our clients an edge over the competition. Social media is changing the way the net is perceived, and we are always ready to adapt the technology to serve our clients towards all ends.

SEO from the social media angle

Many sites on the internet offer to host content and provide a unique link to it. During a recent interaction with clients, we noticed some interesting points about external links to the home website. There were many pages with links like https://sample.blogspot.com across services such as Tumblr and Blogger. The problem with these links was that even though the content was rich and highly relevant to the home page, those attributes were not linked or transferred to the home page. These are simple misconceptions that people often have about content and the rankings it earns. We always advise webmasters to use a reverse proxy and bring all the content under the main website. In the initial analysis, the team of engineers at brandvois interacts with the client and makes sure that no avenue is left unexplored or unattended. We make sure that every bit of useful information is obtained and that the ancillary activities that add value to a website are taken care of. This is part and parcel of our procedures and the service we offer our clients.
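A reverse-proxy setup of the kind suggested above can serve externally hosted content under the main domain, so visitors and crawlers see it as part of the home site. The server block below is an nginx sketch with hypothetical hostnames, not a complete production configuration.

```text
# nginx sketch: serve externally hosted blog content under the main domain
server {
    server_name www.example.com;

    location /blog/ {
        # Content actually lives on the external host, but visitors and
        # crawlers see it at www.example.com/blog/
        proxy_pass https://sample.blogspot.com/;
        proxy_set_header Host sample.blogspot.com;
    }
}
```

With this in place, links pointing at the proxied pages accrue to the main domain rather than to the third-party host.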

External data – A useless addition

One of the most recent revelations about the way Google operates came from a document circulated in SEO circles. The document showed the categories that web pages are classified into for a given search query and gave valuable insight into how to structure an SEO process. We were happy and relieved to see that the classification was very similar to what we have developed over the years. This classification helps us understand the dynamics of the internet and make sure the right keywords are being used on our clients' websites. The classification ranges from vital to useless, with three stages in between. By making sure that back links and other affiliated actions point to the proper pages and content, we are able to give websites higher relevance. Link building is part of our process, and this understanding of Google has helped us a great deal in the past and reinforces our belief in our own system. Proper analysis is required to understand a web page's relevance to Google: a page might be important for certain keywords and useless for others, and understanding this difference is vital to properly optimise a website.

The google way

Google has always tried to stay ahead of people trying to fool its search engine and optimise websites using just keywords and link farms. There was once a time when people showed one thing to the Google web spiders and different content to the actual user. Some even tracked down the IP addresses that Google's bots used; Google put a stop to that by crawling from various IPs and using human reviewers to check the relevance of websites, which put paid to the methods used earlier. The drive behind Google has always been to make websites friendly to the user, and relevant content was given higher priority than any other factor. These days the penalty for such nefarious activity is blacklisting of the website, and we as part of the SEO community welcome such measures on Google's part. Even though these methods provide quick results and get the client's website higher in the rankings, they are extremely harmful to the client in the long run, as visitors get upset with a website that does not provide relevant content. At brandvois we stick to the rules and play to win, using all our available resources to make websites that people will remember and visit with or without the help of a search engine.

The common sense approach

When there is so much content available on the net, how do we get a person to notice and follow a link? There are many ways to do this; a few are ethically wrong and do not always give the best results. Link baiting is always an interesting aspect of SEO, and it takes a lot of creativity and an understanding of people's behaviour to come up with a good link-baiting article. Link baiting is the process of attracting people so that they click through from a simple link and read the article.

Thinking about it a little more deeply, we can see why the title of an article is so important. Over the years we have come to understand this innocuous little detail, and many more like it. Getting a person interested with a title like "leaked specifications of the next PlayStation" is easy, but the disappointment that follows the click is something we never want people to feel. That is where the brainstorming and creative genius of the team at brandvois comes in. During these sessions, we come up with creative ways of identifying and constructing titles that will attract more and more customers without causing any frustration.

To bait or not to bait