A History of Google Search Algorithms
Nearly everyone uses Google to search for what they need on the web. In fact, well over 80% of consumers research products and services using search engines. Many of us can't even remember or imagine a world where answers, dinner, and branded solutions were any harder to find than a search on our smartphones.
Google relies on two major updates to its search algorithm to separate good results from bad: Google Panda and Google Penguin. While the world's biggest search engine can launch in excess of 200 algorithm updates per year, these two are by far the largest in recent memory. Many algorithm updates have only a marginal effect on search results, while Panda's and Penguin's effects were both profound and far-reaching.
Google Panda was launched in February 2011, and was designed to filter out sites with little to no content. The intent was to replace these low-value, useless search results with sites that offered high-quality content. After its implementation, news sites and social networking sites began appearing closer to the top of search results, and sites with more advertisements than information were pushed to the bottom.
For a truly in-depth, comprehensive overview of the history of Google algorithm changes, we recommend Moz's timeline.
The Problems with Google Panda
The one glaring problem with Google Panda was that it couldn't distinguish between sites with original content and sites that had copied content from other sites. When reports of plagiarism began to circulate, it was clear something needed to be done, and quickly. Google responded by asking site owners to submit reports of copied content so it could better distinguish original sources from copies.
Google Penguin followed in April 2012. Its goal was to target sites that used a black-hat SEO tactic called a "link scheme". Through purchases or other means of control, marketers attempted to manipulate their page ranking in Google's search results by increasing the number of links pointing to a particular page. Penguin was designed to penalize sites that existed for the sole purpose of attracting search engine attention, without providing any true or unique value to Google's real customers: web search users.
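Link schemes worked because, in the original PageRank idea behind Google's ranking, a page's score rises with the scores of the pages linking to it. The sketch below is a simplified, illustrative PageRank, not Google's actual implementation (which weighs hundreds of other signals), and the tiny example graph with its "spam" page is invented to show why purchased inbound links could inflate a ranking:

```python
# Simplified PageRank sketch (illustrative only, not Google's real algorithm).
# A page's score depends on the scores of the pages linking to it, which is
# why buying inbound links ("link schemes") could inflate a page's ranking.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Each page q that links to p passes along an equal share
            # of its own score to every page it links out to.
            incoming = sum(
                rank[q] / len(links[q])
                for q in pages
                if p in links[q]
            )
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Hypothetical tiny web: "spam" has bought inbound links from a, b, and c.
web = {
    "a": ["spam"],
    "b": ["spam"],
    "c": ["spam"],
    "spam": ["a"],
}
scores = pagerank(web)
# "spam" ends up with the highest score purely because of its inbound links,
# regardless of whether it offers any real value to searchers.
```

Penguin's penalty, in essence, discounted or punished exactly this kind of artificial inbound-link pattern so that scores like the one above could no longer be bought.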
What Did Panda and Penguin Mean for SMB Marketers?
With rare exceptions, these algorithms were a positive development for SMB and enterprise inbound marketers. If you were doing inbound marketing correctly, creating high-quality content that provided unique value to your prospects and customers, your site likely saw only marginal decreases in ranking or traffic. More likely, your standing in search engine results pages (SERPs) improved.
Do inbound marketers need to study and understand SEO? Absolutely. Only a few individuals know all of the hundreds of factors that Google uses to judge the quality and usefulness of a website. However, Google's own quality guidelines make it pretty clear what search users need: relevant information and ease of use on computers, tablets, and smartphones. Even if you're a social media manager or content marketer, it's crucial to read and absorb the webmaster guidelines in order to serve your customers best.
Source: Marketing Signals
This infographic and post were created by a guest blogger, Megan Barnes of MarketingSignals.com.
Guest Author Bio: Megan Barnes is in the final year of a marketing degree. She particularly enjoys keeping up with the latest innovations in online marketing best practices. Megan recently started sharing her research and insights online through blogging, which she finds both enjoyable and rewarding, as it forces her to stay abreast of changes in the industry.
image credit: marc blickle/flickr/cc