Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
- Links on the web count as votes. Initially, all votes are equal.
- Pages which receive more votes become more important (and rank higher.)
- More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010, and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. Some have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another: a burst of fresh links can often outweigh powerful links, spam links can blunt the effect of fresh links, and so on.
We strongly encourage you to test these yourselves. To quote Rand, “Nothing is better for learning SEO than going out and experimenting in the wild.”
Contents:
- Links From Popular Pages
- Links Inside Unique Main Content
- Links Higher Up in the Main Content
- Links With Relevant Anchor Text
- Links from Unique Domains
- External Links vs Internal Links
- Links from Sites Closer to a Trusted Seed Set
- Links From Topically Relevant Pages
- Links From Fresh Pages
- Rate of Link Growth
- Spam and Low-Quality Links
- Link Echoes
- Linking to Authoritative Content
- Pages That Link To Spam
- Nofollowed Links
- JavaScript Links
- First Link Priority
- Robots.txt and Meta Robots
- Disavowed Links
- Unlinked Mentions
1. Links From Popular Pages Cast More Powerful Votes
Let's begin with a foundational principle. This concept formed the basis of Google's original PageRank patent and quickly helped vault Google to become the most popular search engine in the world.
PageRank can become incredibly complex very quickly, but to oversimplify: the more votes (links) a page has pointing to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links.
In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page is typically more powerful than earning a link from a page with fewer link votes.
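To make the vote-passing idea concrete, here's a toy PageRank iteration in Python over a three-page web. The graph, damping factor, and iteration count are purely illustrative; Google's production systems are vastly more complex.

```python
# Minimal, illustrative PageRank iteration (not Google's production algorithm).
# Keys are pages; values are the pages they link out to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}  # all votes start equal
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages pass nothing in this toy version
            share = rank[page] / len(outlinks)  # rank splits across outbound links
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

ranks = pagerank(links)
```

Running this, page "c" ends up with more PageRank than "b" because it collects votes from both "a" and "b", illustrating how popular pages accumulate more to pass on.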
2. Links Inside Unique Main Content Pass More Value than Boilerplate Links
Google's Reasonable Surfer, Page Segmentation, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page rather than in sidebars, headers, and footers, aka the "boilerplate."
It certainly makes sense, as boilerplate links are not truly editorial, but typically inserted automatically by a CMS (even if a human decided to put them there). Google's Quality Rater Guidelines encourage evaluators to focus on the "Main Content" of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between two links, which would you choose?
- One placed prominently in the first paragraph of a page, or
- One placed lower beneath several paragraphs
Of course, you'd pick the link visitors would most likely click, and Google wants to do the same. Google's Reasonable Surfer patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google's Webspam team, once famously encouraged SEOs to pay attention to the first link on the page and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google's Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role.
Experiments over the years have repeatedly confirmed that relevant anchor text boosts a page's ranking better than generic or non-relevant anchor text.
It's important to note that the same Google patents that propose boosting the value of highly relevant anchors also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether.
That's not to say you should spam your pages with an abundance of exact-match anchors. Data typically shows that high-ranking pages have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words and phrases around the link. Though hard evidence is scant, this idea is mentioned in Google's patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that itβs far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google's algorithms are designed to measure popularity across the entire web, not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a siteβs raw number of total links.
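The distinction between total links and unique linking root domains is easy to compute yourself. Here's a small sketch with made-up backlink URLs; note the root-domain extraction is deliberately naive.

```python
# Count unique linking root domains from a backlink list (URLs are invented).
from urllib.parse import urlparse

backlinks = [
    "https://news.example.com/story",
    "https://blog.example.com/post",
    "https://other-site.org/page-1",
    "https://other-site.org/page-2",
]

def root_domain(url):
    # Naive: keep the last two labels of the hostname.
    # Real tools use the Public Suffix List to handle cases like .co.uk.
    host = urlparse(url).hostname
    return ".".join(host.split(".")[-2:])

unique_domains = {root_domain(u) for u in backlinks}
```

Here four total backlinks collapse into just two unique root domains, and the correlation studies suggest that the second number is the better predictor of rankings.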
Rand points out that this principle is not universally true: "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #5 above, it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high-ranking sites are associated with more external links than lower-ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That's not to say that internal links don't count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here.
More recently, Google updated its original PageRank patent with a section that incorporates the concept of "trust" using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
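The "distance from a trusted seed set" idea is essentially a shortest-path computation over the link graph. Here's a sketch using breadth-first search; the graph and seed list are entirely made up.

```python
# Sketch: link distance from a trusted seed set. Shorter distance from any
# seed implies more trust; unreachable pages get no trust at all.
from collections import deque

graph = {  # page -> pages it links out to (invented example)
    "seed": ["news"],
    "news": ["blog"],
    "blog": ["pbn"],
    "pbn": [],
}

def distance_from_seeds(graph, seeds):
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in dist:
                dist[target] = dist[page] + 1
                queue.append(target)
    return dist  # pages absent from the result are unreachable from any seed

dist = distance_from_seeds(graph, ["seed"])
```

In this toy graph the hypothetical "pbn" page sits three hops from the seed, so under this model it would receive the smallest trust boost.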
Beyond links, Google may also evaluate trust through online reputation (e.g. online reviews or sentiment analysis) and the use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
- The National Dairy Association
- The Association of Automobile Mechanics
Hopefully, you chose the first option because you recognize it's more relevant. Through several mechanisms, Google may act the same way toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity.
These concepts also help discount spam links from non-relevant pages.
The concepts around Google's use of topical relevance are incredibly complex. For a primer on SEO relevance signals, I recommend reading:
- Topical SEO: 7 Concepts of Link Relevance & Google Rankings
- More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages (e.g. newer pages and those more regularly updated) are likely more relevant than pages with links from mostly stale pages, or pages that haven't been updated in a while.
For a good read on the subject, Justin Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It’s important to note that “old” is not the same thing as stale. A stale page is one that:
- Isn’t updated, often with outdated content
- Earns fewer new links over time
- Exhibits declining user engagement
If a page doesn't meet these criteria, it can be considered fresh, no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth may indicate that a page has become stale, making it likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google's Information Retrieval Based on Historical Data patent.
For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).
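One crude way to picture link velocity is to compare a page's recent rate of new links against its historical baseline. All the numbers and the burst threshold below are invented, purely for illustration.

```python
# Toy illustration of link velocity as a freshness signal.
monthly_new_links = [4, 5, 3, 4, 6, 5, 4, 5, 4, 4, 28, 31]  # spike in last 2 months

historical = monthly_new_links[:-2]
recent = monthly_new_links[-2:]

baseline = sum(historical) / len(historical)  # historical average links/month
recent_rate = sum(recent) / len(recent)       # recent average links/month
velocity_ratio = recent_rate / baseline

# Flag a burst with an arbitrary threshold, purely for illustration.
is_burst = velocity_ratio > 3
```

Here the last two months run roughly six times the historical baseline, the kind of spike that, per the patent's framing, could read as a signal of current-event relevance.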
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them.
Google's goal is to focus on editorial links, i.e. unique links that you don't control, freely placed by others. Since Penguin 4.0, Google has implied that its algorithms simply ignore links that don't meet these standards, including links generated by negative SEO and link schemes.
That said, there's plenty of debate over whether Google truly ignores all low-quality links, as there's evidence that low-quality links, especially those Google might see as manipulative, may actually hurt you.
12. Link Echoes: The Influence Of A Link May Persist Even After It Disappears
Link Echoes (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this, and the reverberation effect of links is incredibly persistent, lasting even months after the links have dropped from the web and Google has recrawled and indexed the pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn't wholly dependent on its existence on the page, or factors we can't quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Neil Patel recently ran an experiment where rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Pages Linking to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn't an explicit ranking factor, they've also made statements in the past that it can impact your search performance.
"In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites." - Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there's ample evidence of Google demoting sites it believes to be hosting paid links, or that are part of a private blog network.
Basic advice: link to relevant, authoritative sites (and avoid linking to bad ones) when it benefits your audience.
15. Nofollowed Links Aren’t Followed, But May Have Value In Some Cases
Google introduced the nofollow attribute specifically because many webmasters found it hard to prevent spammy outbound links on their sites, especially those generated by comment spam and UGC.
A common belief is that nofollowed links don't count at all, but Google's own language leaves some wiggle room: they don't drop them absolutely, but "in general" and only "essentially" drop the links from their web graph.
That said, numerous SEO experiments and correlation data suggest that nofollowed links can have some value, and webmasters would be wise to maximize it.
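Auditing which of a page's outbound links carry the nofollow attribute is straightforward with the Python standard library. A sketch (the sample HTML is invented):

```python
# Split a page's outbound links into followed vs. nofollowed
# using Python's stdlib HTML parser.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. rel="ugc nofollow"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

html = '<a href="/a">A</a> <a href="/b" rel="nofollow">B</a>'
audit = LinkAudit()
audit.feed(html)
```

Splitting on whitespace matters because `rel` is a multi-token attribute, so `rel="ugc nofollow"` is still a nofollowed link.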
16. Many JavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to "hide" links using JavaScript, knowing Google couldn't crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so most JavaScript links will count.
That said, Google still may not crawl or index every JavaScript link. For one, rendering JavaScript takes extra time and effort, and not every site delivers compatible code. Furthermore, Google only counts full links with an anchor tag and an href attribute.
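The "anchor tag with an href" requirement means pseudo-links built from onclick handlers or data attributes generally don't qualify. A crude regex-based check over some invented markup snippets (regexes are a poor way to parse real HTML, but they make the distinction visible):

```python
# Only anchor tags carrying an href attribute count as crawlable links;
# onclick handlers and data attributes on other elements generally don't.
import re

snippets = [
    '<a href="https://example.com/page">crawlable</a>',
    '<a onclick="goTo(\'page\')">no href</a>',
    '<span data-url="/page">not an anchor</span>',
]

def is_crawlable_link(snippet):
    # Matches an <a> tag that contains an href attribute.
    return bool(re.search(r'<a\b[^>]*\bhref\s*=', snippet, re.IGNORECASE))

crawlable = [s for s in snippets if is_crawlable_link(s)]
```

Of the three snippets, only the first qualifies: the second is an anchor with no href, and the third isn't an anchor at all.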
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
… Or more specifically, only the first anchor text counts.
If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven't been published for several years.
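The observed behavior can be mimicked with a parser that records only the first anchor text seen for each target URL. In this sketch (sample HTML invented), a navigation link labeled "Home" claims the anchor text before a descriptive in-article link to the same URL:

```python
# Record only the first anchor text seen per target URL, mirroring
# the observed "first link priority" behavior.
from html.parser import HTMLParser

class FirstAnchor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.first_text = {}   # url -> first anchor text encountered
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href not in self.first_text:
                self._current = href

    def handle_data(self, data):
        if self._current:
            self.first_text[self._current] = data.strip()
            self._current = None

html = ('<a href="/widgets">Home</a>'
        '<p>Read about <a href="/widgets">blue widgets</a></p>')
parser = FirstAnchor()
parser.feed(html)
```

Only "Home" is recorded for /widgets; the later, more descriptive "blue widgets" anchor never registers, which is why sitewide navigation can silently eat your best anchor text.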
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in its ranking algorithm, it must be able to crawl and follow it. Unsurprisingly, a number of site- and page-level directives can get in Google's way. These include:
- The URL is blocked from crawling by robots.txt
- The robots meta tag or X-Robots-Tag HTTP header uses the "nofollow" directive
- The page is set to "noindex, follow" but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can't actually crawl the page, any links on the page are virtually invisible.
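You can check the robots.txt case yourself with Python's standard library. The ruleset below is illustrative, not from any real site:

```python
# Check whether robots.txt rules allow a crawler to fetch given URLs,
# using the stdlib parser. The rules string is illustrative.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

allowed = parser.can_fetch("Googlebot", "https://example.com/public/page")
blocked = parser.can_fetch("Googlebot", "https://example.com/private/page")
```

Any link sitting on the blocked page would never be seen by a compliant crawler, which is the "virtually invisible" effect described above.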
19. Disavowed Links Don't Pass Value (Typically)
If you've built some shady links, or been hit by a penalty, you can use Google's disavow tool to help wipe away your sins.
When you disavow links, Google effectively removes those backlinks from consideration when it crawls the web.
On the other hand, if Google thinks you've made a mistake with your disavow file, it may choose to ignore the file entirely, probably to prevent you from self-inflicted harm.
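For reference, a disavow file is a plain-text .txt upload with one entry per line: lines starting with `#` are comments, `domain:` disavows every link from a domain, and a bare URL disavows a single page. A minimal example (domains invented):

```
# Disavow every link from this domain:
domain:spammydomain.example

# Disavow a single page:
https://anothersite.example/bad-page.html
```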
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, or a work of art) without the presence of HTML links, much as it does with local business citations, or when determining which data refers to a brand, a movie, or a notable person.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information, even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It's a heady subject, but suffice it to say Google doesn't always need links to associate data and websites together, and strong entity associations may help a site rank.
We’ve included all hi-resΒ images from this postΒ for folks to use with clients/reports/presentations or their own blog.Β Please credit Moz when using any of these images.
Below, you’ll find all twenty principals combinedΒ into a single graphic.