The SEO zoo – how to avoid pandas and penguins?
As discussed in earlier posts, Google keeps an eye on the SEO techniques in use and continually updates its system for refining search results. Google's ultimate goal is to ensure that the results returned to the end user are relevant: it wants to give users varied, relevant content on websites that are easy to navigate.
The Google Panda and Penguin updates have had a widespread impact, with many websites dropping in rank. Each update comprised multiple algorithm changes rolled out over time, affecting different ranking parameters and different markets. Customers use search engines to find products, and Google is the largest search engine in most markets, so many businesses have seen a significant revenue impact. The verdict on these updates is mixed. Overall search quality has clearly improved: spammy sites have been punished, and over-optimized sites built for search engine bots rather than for the customer experience have also been hit. However, there are quite a few examples on the internet of sites that were following everything Google was trying to promote with Panda and Penguin and were penalized anyway.
The main aim of these updates is to catch over-optimized, user-unfriendly websites and push them down the search results.
Google Panda seeks out low-quality websites, in both content and structure, whereas Google Penguin seeks out spam. The Panda updates target the relevance and quality of a site's content, penalizing thin or duplicated pages that do not deserve to appear high in the results, whereas the Penguin updates target manipulative practices such as keyword stuffing and unnatural link building.
Envigo's clients have been largely unaffected by these updates. In fact, we have seen ranks improve when Google has penalized some of the competition through Panda and Penguin. Envigo has a twofold strategy: a reactive approach and a proactive one. The reactive approach is to gather information about any algorithm update from published sources and from interviews with people like Matt Cutts (of the Google search quality team). The SEO team then reviews the potential impact on each website and draws up a set of changes to the SEO process, which are pressed into action to minimize any impending loss in rank. The proactive approach is to follow an overall SEO philosophy that is closely aligned with what every search engine wants: to serve its users quality content through relevant search results. We help search engines achieve that goal by improving a client's site and content from an end user's perspective, not a search engine robot's.
Our SEO strategies have worked out well so far. Fingers crossed.