How SEO Has Changed

SEO has changed significantly over the past decade, transforming from a simple understanding of algorithms into a discipline unto itself. Search engine optimization today deals primarily with machine-learning, context-sensitive systems that are intentionally designed not to be fooled. Spam, over-optimized pages, and to a certain extent even keywords themselves are becoming obsolete in the modern era.

Google’s Quality Update in 2011

The face of modern search engine optimization began to change in 2011 with the Google Panda update. In prior years, search engines had relied almost purely upon search term relevancy: users entered queries, and the search engine simply tried to return the sites most relevant to those queries. In practice, this made it fairly easy for search engine optimizers to game the system. In a simple example, someone searching for “socks” would often be shown the website that mentioned “socks” the most. The Panda update changed all of this. Instead of simply targeting relevant websites, Google’s update attempted to determine the actual quality of each website. Rather than just returning relevant results, Google wanted to return high-quality, authoritative results. Google leveraged its PageRank system to cut down on unhelpful websites and pushed low-quality and duplicated content to the bottom of the results. But Google wasn’t yet done.
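
To make the pre-Panda weakness concrete, here is a minimal sketch of purely relevance-based ranking, assuming a toy index where a page’s score is simply how often it mentions the query term. The pages and the scoring are entirely illustrative and not Google’s actual algorithm; the point is only that a keyword-stuffed page wins.

```python
# Toy illustration of pre-Panda, relevance-only ranking: the score is just
# the number of times a page mentions the query term, so a stuffed page wins.
pages = {
    "honest-sock-shop.example":  "We sell wool socks and cotton socks in all sizes.",
    "stuffed-spam-site.example": "socks socks socks buy socks cheap socks best socks socks",
}

def relevance_score(text: str, term: str) -> int:
    """Count occurrences of the query term -- the only signal a naive engine uses."""
    return text.lower().split().count(term.lower())

query = "socks"
ranked = sorted(pages, key=lambda url: relevance_score(pages[url], query), reverse=True)
for url in ranked:
    print(relevance_score(pages[url], query), url)
# The keyword-stuffed page outranks the genuinely useful one.
```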

Google’s Penguin Update in 2012

The Google Penguin update was believed to impact only about 3.1% of search queries on Google, but it targeted a specific area: websites that were posting spam and using black hat SEO techniques. Following the Panda update, the Penguin update greatly improved the overall quality of search results. In this way, Google was able to discourage techniques such as keyword stuffing. Since 2012, Google’s spam algorithm has been updated multiple times, each update aimed at removing “low value” content from the search giant’s results. This has made it far more difficult for marketers to quickly deploy websites and gain traffic through tricks; instead, they have to focus on high-quality content.
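
Penguin’s internals were never published, but over-optimized, exact-match anchor text in a site’s backlink profile was one of the most commonly cited signals it targeted. The sketch below is a rough self-audit heuristic, not Google’s algorithm; the backlink data and the 40% threshold are purely illustrative.

```python
# Hypothetical backlink profile: (linking page, anchor text) pairs.
backlinks = [
    ("blog.example/review",   "cheap running shoes"),
    ("forum.example/thread1", "cheap running shoes"),
    ("forum.example/thread2", "cheap running shoes"),
    ("news.example/article",  "this store"),
    ("friend.example/post",   "Acme Shoes"),
]

TARGET_KEYWORD = "cheap running shoes"
THRESHOLD = 0.40  # illustrative cut-off, not a published Google number

exact_match = sum(1 for _, anchor in backlinks if anchor.lower() == TARGET_KEYWORD)
ratio = exact_match / len(backlinks)
print(f"exact-match anchors: {ratio:.0%}")
if ratio > THRESHOLD:
    print("Anchor text looks over-optimized -- a profile like this invited Penguin scrutiny.")
```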

What Hummingbird in 2013 Was All About

Hummingbird was an algorithm update that primarily affected conversational search. For instance, if you were to ask, “Where is the closest dealership to buy a Hyundai?”, a traditional search engine would look for keywords such as “dealership,” “Hyundai,” and “buy.” Hummingbird’s purpose was to find the actual meaning behind the words and to understand the sentiment and intent of the user. Users speak very differently out loud than they type, which leads to longer-tail queries with more sentiment cues and semantic relationships. Additionally, the algorithm couples these contextual cues with location data to serve geographically relevant search results.
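
The contrast can be sketched in a few lines. The rule-based parser below is purely illustrative (Hummingbird itself is far more sophisticated and not public); it only shows the difference between pulling out keywords and recognizing the intent behind a conversational query.

```python
import re

STOP_WORDS = {"where", "is", "the", "to", "a", "an"}

def extract_keywords(query: str) -> list[str]:
    """Old-style approach: strip punctuation and stop words, keep the rest."""
    words = re.findall(r"[a-z]+", query.lower())
    return [w for w in words if w not in STOP_WORDS]

def parse_intent(query: str, user_location: str) -> dict:
    """Toy intent parser: one hand-written rule for 'where is the closest X' queries."""
    match = re.search(r"where is the closest (.+?) to buy (?:a |an )?(\w+)", query.lower())
    if match:
        return {
            "intent": "find_nearby_business",
            "business_type": match.group(1),  # e.g. "dealership"
            "product": match.group(2),        # e.g. "hyundai"
            "near": user_location,            # supplied by the device, not the words
        }
    return {"intent": "unknown"}

query = "Where is the closest dealership to buy a Hyundai?"
print(extract_keywords(query))                           # ['closest', 'dealership', 'buy', 'hyundai']
print(parse_intent(query, user_location="Austin, TX"))   # intent plus entities plus location
```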

The Mobilegeddon Updates

In April 2015, Google released a new algorithm intended to demote websites in mobile search results that were not mobile friendly. Google’s reasoning was to provide a better user experience for its search users. In addition to ensuring that a website was mobile responsive, this algorithm update also considered technical factors such as mobile page load times, render-blocking JavaScript, and overall mobile user experience. This wasn’t the end of Mobilegeddon: Google released a second round of the algorithm in 2016 to scrub search results once more of websites with poor mobile experiences. The second round placed even more emphasis on technical factors, giving signals like mobile page load times greater weight.
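
As a rough illustration of the kind of technical checks marketers started running, the sketch below scans a page’s HTML for two common culprits: a missing viewport meta tag and render-blocking scripts in the head. This is a simplified stand-in for tools like Google’s mobile-friendly test, and the sample HTML is hypothetical.

```python
from html.parser import HTMLParser

class MobileAudit(HTMLParser):
    """Flags a missing viewport meta tag and render-blocking <script> tags in <head>."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.has_viewport = False
        self.blocking_scripts = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True
        elif tag == "script" and self.in_head and attrs.get("src"):
            # Scripts in <head> without async/defer block the first render.
            if "async" not in attrs and "defer" not in attrs:
                self.blocking_scripts.append(attrs["src"])

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

page = """<html><head>
<script src="/js/app.js"></script>
</head><body><p>Hello</p></body></html>"""

audit = MobileAudit()
audit.feed(page)
print("viewport meta present:", audit.has_viewport)        # False
print("render-blocking scripts:", audit.blocking_scripts)  # ['/js/app.js']
```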

Machine Learning and RankBrain

In October 2015, Google announced RankBrain, a machine-learning system that interprets queries in real time. By the time of the announcement, the algorithm had already been running for several months, and Google described it as the third most influential ranking factor. *Note: the actual launch is believed to have been closer to spring 2015.
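
Google has said little about RankBrain’s internals, but the widely reported idea is that queries and pages are compared as vectors, so a page can match a query’s meaning even without sharing its exact words. The sketch below uses tiny hand-made word vectors and cosine similarity purely to illustrate that idea; real systems learn their vectors from massive corpora.

```python
import math

# Toy word vectors -- entirely made up for illustration.
VECTORS = {
    "cheap":      [0.90, 0.10, 0.00],
    "affordable": [0.85, 0.15, 0.05],
    "laptop":     [0.10, 0.90, 0.20],
    "notebook":   [0.12, 0.88, 0.25],
}

def embed(text):
    """Average the vectors of the known words in a phrase."""
    vecs = [VECTORS[w] for w in text.lower().split() if w in VECTORS]
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

query = embed("affordable notebook")
print(cosine(query, embed("cheap laptop")))  # ~0.99: close in meaning, no shared keywords
print(cosine(query, embed("cheap")))         # ~0.75: weaker match
```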

February 6, 2017 and Fred (Unofficial Name)

These were perhaps the most significant algorithm updates in several years, causing enormous commotion throughout the online SEO community. The February 6th update was believed to have targeted websites with low-quality content and other user experience signals drawn from Google’s Quality Rater Guidelines.
Shortly after, another update, unofficially dubbed Fred, shook the SEO world; it was believed to focus specifically on low-quality content.

How SEO Has Changed Due to Google

Following Google’s updates, it has become necessary for websites to maintain consistently high-quality content. Websites are now rated not only on relevancy but also on trust; websites with a high level of authority are far more likely to be returned first in search engine results. Further, marketers need to be aware that it’s easier than ever to be penalized or blacklisted by Google. Strict adherence to Google’s webmaster guidelines is absolutely necessary for marketers who want to keep their sites healthy. Marketers need to be aware of how SEO has changed and of current best practices. SEO is no longer a matter of simply inserting keywords and long-tail keywords, or of ensuring that the website is appropriately crawled. Instead, SEO has become a mission of understanding the content that Google wants: long-form, high-quality content that is distributed through multiple social media channels and links. By building trust and authority, domains are able to quickly improve their search engine rankings.