
A Look Back at the 2014 Google Search Engine Algorithm Updates

For businesses living partially or entirely on the web, Google algorithm updates and new releases are anticipated events. Whether that anticipation is positive or negative, however, depends on the work a business has done: either cleaning up the issues that earned it penalties under a previous algorithm, or continuously working toward a user-friendly experience with great content and easy usability. Over the course of 2014, Google rolled out new applications, updates, and refreshes, and removed certain functions, all to give users the best possible search experience. SEO companies and web design agencies do well to keep themselves apprised of these changes so their work reflects current best practices.


February – Page Layout

The official Google announcement cited user experience as the primary reason for the page layout algorithm. Introduced in January 2012, the algorithm addressed the common user complaint that “if you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.” The “above the fold” page layout algorithm acted as a type of reward algorithm for sites that placed content above the fold while leaving ads lower on a given page. It was pushed live in response to the growing number of sites that placed more ads, instead of relevant content, toward the top of a given page.

Google did not want to penalize sites that “place[d] ads above-the-fold to a normal degree,” recognizing the need for some ads to be higher up on a page because they helped “publishers monetize online content.” That is why the initial version of the page layout algorithm affected less than 1 percent of global searches.

In October 2012 there was a small update to the original version, which reportedly affected less than 0.7 percent of English language searches.

No further updates were made to the page layout algorithm until early 2014. At the beginning of February, Google announced a refreshed “above the fold” page layout algorithm. Following the same path as the previous updates, the purpose of the newest installment was to target more aggressively those sites with a lot of advertisements at the top of each page – again with the goal of ensuring a positive user experience.


May – Payday Loans & Panda

Around the middle of the month Google updated its original payday loans and Panda algorithms.

In an effort to reduce the number of spammy results produced by certain search queries, Google set up a new anti-spam algorithm in 2013 that focused on those queries. Impacting about 0.5 percent of English language queries, the algorithm quickly became known as the “payday loans update” because Google’s Distinguished Engineer Matt Cutts not so subtly “mentioned that the change would affect queries such as ‘payday loans'” in the announcement of the update.

Like the original payday loans update, the 2.0 version was specifically focused on particularly spammy sites. In a move to come down harder on international spam, this update also affected more international queries than English ones, although the exact percentage affected is largely unknown.

Google began to take serious action against sites featuring low quality content in 2011 with the first iteration of the Panda ranking algorithm. Panda was intended to target sites with poor quality and duplicate content to help produce the best user experience possible with search results that were relevant to a user’s query. The Panda update impacted about 12 percent of US search results, and while Panda did not “officially” target content farms and site scrapers, the update did seem to affect sites with less original content as well as those with duplicate content.

The Panda updates have generally been the ones companies watch out for and anticipate because these updates have traditionally affected search result rankings more than other changes.

Again focusing on high-quality content for a better user experience, Google unveiled the newest iteration of its Panda algorithm, Panda 4.0. This version targeted sites with poor-quality content, helping to keep them out of the top search results, and the update impacted about 7.5 percent of English language queries.


June – Payday Loans & Authorship Photo Removal

Quickly following the payday loans update in May, this new iteration of the payday loans algorithm targeted spammy queries, rather than the specific spammy sites the second version was meant to address.

Late June brought the surprise removal of all authorship photos from search engine results pages. The purpose of the authorship markup was to “enable websites to publicly link within their site from content to author pages,” which would “include things like the author’s bio, photo, articles and other links.” Introduced in 2011, the markup proved far less popular than Google had hoped, with low publisher and webmaster adoption over the years. Even so, from the time the markup was pushed live, optimizing an authorship photo could help increase rankings – at times Google displayed authorship photos in 22 percent of all searches.

The reasons behind the authorship photo drop are a bit blurry, with the official response being that Google was working toward streamlining the look of search results, as well as creating a “better mobile experience and a more consistent design across devices.” Other theories include the idea that the photos distracted users from advertisements, costing advertisers clicks, and the discovery of a decrease in the quality of the search results produced by the addition of the markups.


July – Pigeon

According to Search Engine Land, the new Pigeon algorithm introduced by Google helps users find “more useful, relevant, and accurate local search results that are tied more closely to traditional web search ranking signals” through the use of improved parameters for distance and location ranking. These traditional signals more than likely refer to a navigable, easy-to-understand URL structure, backlinks, domain authority, title tags and other common ranking factors.

Pigeon is an algorithm aimed at providing local results that are more accurate and relevant to what the user is searching for. If someone is searching for the best restaurants in Atlanta, the discerning signals used by Pigeon will surface the search results that best fit the query and location. Unlike many of the better-known updates like Panda and Penguin, Pigeon was not created with the intent of penalizing sites; that is, it was not meant to clear low-quality content out of the search engine results pages. This update was meant to be “a core change to the local search ranking algorithm.” With Pigeon, local listing packs (those particular results that also feature snippets like the address and phone number) are no longer associated with as many keywords as they once were, which produces far fewer results in an effort to return those that are most relevant to the query.

As always, Google credits Pigeon with being an algorithm intent on providing the best, most useful search results to users with the goal of creating a positive user experience.


August – HTTPS/SSL & Authorship Drop

Google uses a lot of signals to determine how to rank sites, and in early August secure sites became one of the newest signals. While Google says it is a “very lightweight signal” – it will impact less than 1 percent of queries – Google also hints that the weight given to having a secure site could increase in the future.

Highly publicized cases of Internet security failures are well known to most people. Hackers, computer viruses, firewalls, and other terms once recognized only by tech-savvy individuals are now part of everyday chatter. Internet security has become a priority where it used to be an afterthought. Because of the heightened threats users face while traversing the digital terrain of the Internet, Google decided to use HTTPS as a ranking signal. This move gives both a minor ranking bump to sites that serve their pages over HTTPS and an encouraging nudge to other websites to get on the secure-site bandwagon.
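For site owners who want to act on that nudge, the short sketch below shows one way to check that a site already presents a trusted certificate and that plain-HTTP requests end up on an https:// URL. It is only a minimal illustration using the Python standard library – not anything Google provides – and the www.example.com domain is a placeholder to swap for your own.

    import socket
    import ssl
    import urllib.request

    SITE = "www.example.com"  # placeholder domain; replace with the site being checked

    def has_valid_https(host, port=443):
        """Return True if the host completes a TLS handshake with a certificate the system trusts."""
        context = ssl.create_default_context()
        try:
            with socket.create_connection((host, port), timeout=5) as sock:
                with context.wrap_socket(sock, server_hostname=host):
                    return True
        except OSError:  # covers ssl.SSLError, timeouts, and refused connections
            return False

    def redirects_to_https(host):
        """Return True if a plain-HTTP request is redirected to an https:// URL."""
        try:
            with urllib.request.urlopen("http://" + host + "/", timeout=5) as response:
                return response.geturl().startswith("https://")
        except OSError:
            return False

    if __name__ == "__main__":
        print("trusted certificate:", has_valid_https(SITE))
        print("redirects to HTTPS:", redirects_to_https(SITE))

If either check fails, installing a certificate and redirecting HTTP traffic to HTTPS at the server level (and updating internal links to match) are the usual first steps toward picking up the signal.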

Unlike the surprise decision in June to drop the photo from the authorship markup, the full removal of the feature in August was less of a shock. The original intent of the authorship markup was to allow the linking of content within a site to other pages an author owned or contributed to. However, for reasons ranging from concern over visual aesthetics to the fact that the markup’s popularity never grew beyond the small initial excitement, the photo aspect of the markup was dropped. This change paved the way for the almost inevitable removal of the entire authorship markup in late August. Google’s John Mueller stated that the reason for the full drop was that the “information isn’t as useful to our users as we’d hoped, and can even distract from those results.”


September – Panda

About four months after the much-anticipated release of the Panda 4.0 algorithm, Google rolled out another version of Panda. The original update, which was released in 2011, targeted sites with low-quality, thin content and sought to reward sites that actively updated their pages with quality content. The most recent version, dubbed Panda 4.1, is estimated to have impacted up to 5 percent of search queries. Through user feedback, Google was able to find more signals that could help pinpoint sites featuring low-quality content with striking accuracy. Pierre Far of Google stated that the implementation of the new signals created as a direct result of these findings “results in a greater diversity of high-quality small- and medium-sized sites ranking higher,” which helps increase the visibility of unique, lesser-known sites in search results.

Panda 4.1 seems to have used new signals while beefing up the old ones that targeted issues like keyword stuffing and content farms. Many sites that had used these black hat techniques were effectively brought low by the Panda 4.1 algorithm. Also affected were sites containing duplicate or thin content – traits Panda has gone after since its debut in 2011.

Small and medium-sized sites that provided fresh, relevant content were also affected, but positively, in the form of improved rankings.


October – In The News, Penguin, & Pirate

Most people are pretty familiar with the news box feature in Google. The box would show up in web search results and contained the most current news from well-known outlets like CNN and the BBC. In October Google replaced that feature with a different version called the “In The News” box, which features content from traditional news sources as well as content from other sites that may or may not be considered newsworthy or credible. Unlike the older version, the “In The News” box is not managed by a person, which raises concerns because virtually any kind of information can be pulled into the box. Google confirmed as much in its statement following the release of the feature: “We will be pulling from all over the web which means that we will present as diverse a range of voices as possible to ensure we get users to the answer they are looking for.”

The effects of this update were seen mainly by news outlets, and the change was not meant as a penalty.

The first version of the Penguin algorithm, which was introduced in 2012, sought out websites that exhibited evidence of spammy activity in the form of external links. Huge drops in search engine visibility were served up to sites that Penguin identified as having deceptive links set into content with the sole purpose of increasing search rankings. Google has made it clear over the years that user experience and security are of the highest importance. Sites that contain fresh, relevant content with authoritative links that occur naturally within a page would be rewarded, while sites with thin, duplicate content and links that did not add value to a page would be punished.

Penguin 3.0 also targeted sites with links that were irrelevant or manipulative, but rather than producing a wide impact radius of affected sites (less than 1 percent of English language queries were hit), this global version of Penguin was more of a refresh of the older updates than a massive overhaul with new signals in place.

In 2012 Google began to take action toward Internet piracy with the Pirate algorithm. Using Digital Millennium Copyright Act “takedown” requests, Google created and added new signals to their ranking criteria that focused on sites with copyright infringements. Sites with a large number of these DMCA requests would potentially see a drop in the rankings.

Since the first iteration of the algorithm was released, Google has made updates to Pirate that help filter through more potential sources. Pirate 2.0 was a highly focused update targeting software and digital media piracy infringements. After this new version of Pirate was launched, a small target group of sites fell dramatically in rankings – up to 98 percent in some cases.


December – Penguin & Pigeon

The very first Penguin algorithm rocked the Internet world, impacting about 3.1 percent of English language queries. The main goal of the algorithm was to crack down on sites that used black hat tactics and spammy linking structures to increase rankings.

With each Penguin update businesses that had been affected would have the opportunity to regain some of their lost rankings – if they had taken the time to fix the errors that had gotten them penalized in the first place.

In December Google announced that Penguin had been redesigned to provide continuous, more frequent updates. This suggests the updates will be smaller and will affect fewer queries, but they will still focus on monitoring sites for keyword stuffing, spammy linking, and other black hat techniques.

Google introduced Pigeon to the United States in July with the purpose of providing more accurate local search results to users. Using improved indicators for distance and location, as well as traditional search signals, Google hopes to provide a more relevant search experience.

In December the Pigeon algorithm was expanded to the United Kingdom, Canada, Australia, and all other English-speaking locales except India. Just as in the United States, the global rollout of the Pigeon update is intended to provide these countries with more accurate local search results.

Google continues to create updated versions of their original algorithms, as well as to come up with brand new ways to provide users with a more relevant, positive experience. Because recovering from penalties is not usually a quick process, it’s very important to always follow best practices and work toward a great user experience.
