January 17, 2020
The Next Evolution of Search Engine Optimization
Mark Robinson


By the time Google began its ascent to dominance of the search industry in the early 2000s, a new industry revolving around search engines was already gaining traction. Smart marketers quickly recognized that if their websites ranked at the top of the search results, it meant free traffic to their websites. And so the SEO industry was born.

In the early days of the industry, “black hat SEO” (think cowboy movies where the bad guys wore black hats) was much more prevalent. Black hat SEO involves tactics employed to game the system in order to achieve improved search engine results page (SERP) rankings. These tactics are considered unethical business practices and, if discovered, they could result in a ranking penalty assessed to the offending website.

One of these tactics was called “comment spamming”: a practice by which a spammer would post promotional messages in the comment sections of blogs, on message boards, in web forums, online guestbooks, wikis, and the like (collectively referred to as “user generated content”, or UGC for short), with links back to the spammer’s website. And because Google’s PageRank algorithm (named after Google co-founder Larry Page, not the web pages the algorithm ranks!) is rooted in the philosophy that a link from website A to website B is an endorsement that passes PageRank value (a.k.a. “link juice”) from website A to website B, this was an effective black hat tactic for improving SEO rankings.

Obviously, website A is negatively impacted in the scenario above, because website B is siphoning off website A’s link juice, decreasing A’s ability to rank well. But Google also saw this as a risk to its own business. If spammy websites with low-quality content began showing up at the top of Google’s SERPs simply because they were gaming the system, users would eventually switch to an engine that provided better results. So in 2005, Google introduced the “nofollow” link attribute to combat this issue. Simply put, it’s a small snippet of code (rel="nofollow") added inside a link’s HTML tag that tells Google not to follow that link or pass any PageRank value to the site it points to. As a result, spammers would be discouraged from posting spammy links in UGC areas, because they would no longer reap the SEO benefit that was motivating them to do so in the first place.
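To make that concrete, here’s what the markup looks like in practice (the URL and link text are placeholders, not from any real site):

```html
<!-- A standard link: treated as an endorsement that passes
     PageRank ("link juice") to the target site -->
<a href="https://example.com/">Check out my site!</a>

<!-- The same link with the nofollow attribute: Google is told
     not to follow the link or pass any PageRank to the target -->
<a href="https://example.com/" rel="nofollow">Check out my site!</a>
```

Blogging and forum platforms could simply add rel="nofollow" to every link posted in their UGC areas, stripping those links of any SEO value.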

There was a downside to introducing this code, though. It didn’t take long for SEO pros to realize that using the nofollow attribute on all of their outbound links would help them retain as much SEO value as possible on their own sites. And while Matt Cutts announced in 2009 that the algorithm had been updated to prevent the hoarding of link juice, it stands to reason that many webmasters and SEO pros were (and continue to be) unaware of this, and they continue with this strategy today. Otherwise, would Google have much reason to change the way link attributes work in the algorithm? (Unless they simply see it as their Everest, and are doing it to prove that they can.) Nonetheless, if the nofollow attribute is still being used by many sites as a link juice hoarding tactic, then Google’s algorithm is still less than optimized.


Addressing the issue would be no small task. But behind the scenes, Google has been putting pieces in place that are surely part of the fix. Over time we’ve seen updates demonstrating Google’s continuous pursuit to understand the context and quality of a page’s content, and ultimately that content’s PageRank value, regardless of the existence of links. Examples include:

  • The Panda update (Spring 2011): Introduced the ability to evaluate and rank websites based on various indicators of content quality.
  • The Hummingbird Update (Summer 2013): Demonstrated the ability to interpret the user intent behind a query, and return webpages that would be the most qualified in solving for that intent.
  • The RankBrain Update (Spring 2015): In October of 2015, Google announced that an artificial intelligence component designed to understand the user intent behind queries had been part of the algorithm for months. This update improved upon Hummingbird by adding a machine-learning component that adjusts rankings over time based on user interactions indicating a piece of content’s usefulness relative to the context of a search query, even if the content doesn’t contain the exact phrase being searched.
  • The Implied Links Update (Timing uncertain): This isn’t documented as an official update, but at some point prior to September of 2017 when the capability was acknowledged by a Google search insider, Google added the ability to recognize unlinked citations into its algorithm. This is also confirmed in step 302 of Google’s patent for ranking search results.


So this is how things stood in the 14 years following the introduction of the original nofollow link attribute. But on September 10, 2019, Google announced on its Webmaster Central Blog that, “It’s time for nofollow to evolve.” The post introduces two new link attributes that give webmasters additional ways to identify to Google Search the nature of particular links: rel="sponsored" flags advertisements or paid sponsorships, while rel="ugc" marks links within user generated content (i.e. forum posts, comments, etc.). The post then goes on to say that all three of the link attributes are now treated as hints about which links to consider or exclude within Search. Google will use these hints — along with other signals — as a way to better understand how to appropriately analyze and use links within their systems.
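In markup, the three attributes look like this (URLs and link text are placeholders for illustration):

```html
<!-- Paid placement or advertisement -->
<a href="https://example.com/product" rel="sponsored">Our sponsor</a>

<!-- Link posted in user generated content, e.g. a forum post or comment -->
<a href="https://example.com/" rel="ugc">A commenter's link</a>

<!-- The original catch-all, for any link you don't want to endorse -->
<a href="https://example.com/" rel="nofollow">An unvetted link</a>

<!-- Values can also be combined where more than one applies -->
<a href="https://example.com/" rel="ugc nofollow">A commenter's link</a>
```

Per the announcement, none of these is required in any particular place — the new attributes just give Google more granular hints about why a link isn’t an outright endorsement.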


Okay, maybe the future isn’t here just yet. When I started thinking about all of the capabilities behind the updates listed above, I began to question why Google didn’t just do away with link attributes altogether. But I suppose — despite the advances that have been made in natural language processing, semantic search, query intent, and artificial intelligence — by saying they’ll continue to use the link attributes as a way to better understand how to appropriately analyze and use links within their systems, Google is acknowledging that there’s still room for improvement. And given even a rough sense of the amount of search-related data Google collects every day, I’m guessing it won’t take long (relatively speaking) for Google’s AI to perfect its interpretation of links without the use of attributes, and the Future of SEO will finally be here.
