By now, we are all aware of the importance of search engines to our online life. Search engines like Google provide us with an indispensable tool to assist us in looking for specific pieces of information buried within the millions and millions of webpages that constitute the World Wide Web. Just type in a keyword or phrase, press enter, and in an instant, a list of results appears for our convenience.
But the question is, how are search engines able to give us relevant results every time? Considering the countless webpages out there, how do these search engines determine exactly which pages contain the information we are looking for? Well, it wasn't easy: it was the product of years of experience, webpage analysis, user feedback, and algorithm modifications and updates.
Google, for instance, was able to come up with two significant and innovative updates: the Google Panda and Penguin updates.
The Google Panda update, first released in February 2011, aimed to filter out "low-quality" or "thin" sites by lowering their ranks in the Search Engine Results Page, or SERP. The Panda update, named after engineer Navneet Panda who made the algorithm possible, uses sophisticated artificial intelligence to determine which sites are high-quality and contain relevant content that users will find useful based on the keyword or phrase they typed.
Unfortunately, the update had a downside: websites that contained content copied or duplicated from other websites could still rank high on the SERPs, while the sites that contained the original content ranked low.
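To make the duplicate-content problem concrete, here is a toy sketch of one common way near-duplicate text can be detected: comparing word "shingles" (overlapping word sequences) with Jaccard similarity. This is purely illustrative and is not Google's actual algorithm; the function names and the choice of three-word shingles are assumptions for the example.

```python
# Toy near-duplicate detection using word shingles and Jaccard similarity.
# Illustrative only -- not Google's actual duplicate-content algorithm.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word sequences)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "search engines help users find relevant pages on the web"
copied   = "search engines help users find relevant pages on the internet"

# A lightly edited copy still shares most shingles with the original,
# so its similarity score stays high.
print(jaccard_similarity(original, copied))
```

A real system would compare pages at web scale (e.g. with hashing tricks rather than exact set intersection), but the underlying idea of scoring textual overlap is the same.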
This is where the Google Penguin update takes over. First announced on April 24, 2012, it aims to decrease the rankings of websites that use black-hat SEO tactics. These techniques include keyword stuffing, or loading a page with keywords that may not even relate to its content; cloaking, or hiding keywords within the page, usually by using a font color that matches the background color; link schemes, such as buying links to redirect traffic to the site; and publishing content copied or duplicated from other sites.
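As a simple illustration of how one of these tactics can be flagged, the sketch below computes keyword density, the kind of crude signal a spam filter might use to spot keyword stuffing. The 5% threshold and the function names are arbitrary assumptions for the example, not figures published by Google.

```python
# Toy keyword-stuffing check based on keyword density.
# The 5% threshold is an assumed value for illustration only.
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of words in the text that are the given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(text, keyword, threshold=0.05):
    """Flag the text if the keyword's density exceeds the threshold."""
    return keyword_density(text, keyword) > threshold

stuffed = "cheap shoes cheap shoes buy cheap shoes cheap shoes online"
normal  = "we sell a wide range of footwear at reasonable prices"

print(looks_stuffed(stuffed, "cheap"))  # the keyword is 4 of 10 words
print(looks_stuffed(normal, "cheap"))
```

Real ranking systems weigh many signals together rather than a single density score, but the example shows why repeating a keyword unnaturally is easy for software to detect.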
Even now, Google keeps modifying its search algorithm in response to new black-hat SEO tactics used to improve a website's SERP rank. This is to ensure that it consistently displays relevant results. Google has even said that, on average, it makes almost one and a half changes to its search algorithm per day to refine search results.