6 Ranking Factors You Had No Idea Existed

We all know that keywords and links are cornerstones of SEO, but I thought I'd take today to talk about a few ranking factors that most SEOs have never even heard of. Let's get started.

1. Domain Expiration Date

Most SEOs are well aware that older domains are more likely to rank in the search results than new ones, but did you know that Google could be using your domain's expiration date as a ranking factor? According to this patent, they may be doing exactly that:

"Certain signals may be used to distinguish between illegitimate and legitimate domains. For example, domains can be renewed up to a period of 10 years. Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith."

2. Public or Private WhoIs

Matt Cutts has hinted that if your WhoIs data is protected, that could be held against you as a ranking factor, since privacy protection can suggest you have something to hide. At the very least, you can expect manual review teams to take it into account. Here's what Cutts said on the matter:

"Rather than any real content, most of the pages were pay-per-click (PPC) parked pages, and when I checked the whois on them, they all had "whois privacy protection service" on them. That's relatively unusual. Having lots of sites isn't automatically bad, and having PPC sites isn't automatically bad, and having whois privacy turned on isn't automatically bad, but once you get several of these factors all together, you're often talking about a very different type of webmaster than the fellow who just has a single site or so."

Some have speculated that Google might also view a site with suspicion if its WhoIs owner has been penalized in the past. That speculation makes sense, though we currently have no confirmation of it.
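The same WhoIs record tells you whether your registration is behind a privacy service. As a quick self-audit, here's a sketch, again assuming python-whois; the marker strings are my own guesses at common privacy-service wording, not an official list:

```python
# Quick self-audit: does the raw WhoIs text mention a privacy service?
# Assumes python-whois; the marker strings below are illustrative guesses.
import whois

PRIVACY_MARKERS = ("privacy", "redacted", "proxy", "whoisguard")


def looks_private(domain: str) -> bool:
    """Scan the raw WhoIs response, since field names vary by TLD."""
    raw = whois.whois(domain).text.lower()
    return any(marker in raw for marker in PRIVACY_MARKERS)


print("Private registration?", looks_private("example.com"))
```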

3. Content Length

This one's a bit better known than some of the others, but still obscure enough that it's worth mentioning, especially since it's also fairly influential. According to serpIQ:

"For most SERPs it looks like at least 1500 words is a good target. This isn't a steadfast rule - you'll need to adjust this target to fit the niche that you're in. If you're working on your own projects or even on client websites, keep in mind that not all content is equal. If writing isn't your strong point, finding someone who can create compelling sales copy, blog posts or informative content is going to pay off in a big way down the road."

After analyzing a large dataset of SERPs, they found that, on average, position 10 search results had 400 fewer words than position 1 results. The effect was much more noticeable for mid- and low-competition queries than for highly competitive keywords.
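If you want to audit your own pages against that ~1,500-word target, a rough word count of the visible text is easy to script. Here's a minimal sketch, assuming the requests and beautifulsoup4 packages; the URL is a placeholder:

```python
# Rough word count of a page's visible text.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup


def visible_word_count(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop script/style blocks so markup doesn't inflate the count.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())


count = visible_word_count("https://example.com/some-post")  # placeholder URL
print(f"{count} words ({'above' if count >= 1500 else 'below'} the 1,500-word target)")
```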

4. SSL Certificates

An SSL certificate encrypts data in transit, so transactional details can't be intercepted by eavesdroppers. Anybody involved in eCommerce should have one, regardless of Google. However, Google has confirmed that it indexes SSL certificates, and it stands to reason that it could use this data as a ranking factor to distinguish eCommerce sites from publishers.
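You can confirm that your certificate validates and see when it expires with nothing but the Python standard library. A minimal sketch:

```python
# Check that a site's certificate validates and report its expiry.
# Standard library only; raises ssl.SSLError if validation fails.
import socket
import ssl
from datetime import datetime, timezone


def cert_expiry(host: str, port: int = 443) -> datetime:
    context = ssl.create_default_context()  # verifies the chain by default
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )


print("Certificate expires:", cert_expiry("example.com"))
```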

5. Link Age

Another Google patent suggests that the age of a link can be used as a signal of trust:

"According to another implementation, the analysis may depend on weights assigned to the links. In this case, each link may be weighted by a function that increases with the freshness of the link. The freshness of a link may be determined by the date of appearance/change of the link, the date of appearance/change of anchor text associated with the link, date of appearance/change of the document containing the link."

All too often, we see SEOs focusing on building more links, or more authoritative links, when they should be placing more emphasis on building links that will last. When a link stays in place for a long period of time, it's much less likely to be a paid link, a spam link, or a link that was earned dishonestly.
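You can't control how Google weights link age, but you can watch whether your hard-won links survive. Here's a minimal monitoring sketch, assuming requests and beautifulsoup4; the page and target URLs are hypothetical placeholders:

```python
# Re-fetch pages that link to you and confirm the links are still live.
# Assumes requests and beautifulsoup4; URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

BACKLINKS = {
    # linking page -> the URL on your site it should point at
    "https://example.com/roundup-post": "https://yoursite.example/",
}


def link_still_present(page_url: str, target: str) -> bool:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return any(a["href"].startswith(target) for a in soup.find_all("a", href=True))


for page, target in BACKLINKS.items():
    print(page, "->", "still live" if link_still_present(page, target) else "missing")
```

Run on a schedule, a check like this tells you when links quietly disappear, which matters if long-lived links really do carry more trust.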

6. User Behavior Data

Google has publicly admitted that it uses Google Toolbar data as a ranking signal. It has also been revealed that Google collects data from Chrome, and it's hard to believe that data isn't used for ranking purposes. If Google Instant is enabled in Chrome, Google receives everything you type into the address bar. Google also collects bookmark, Gmail, and Docs data, though it's not clear to what extent, if any, these are used as ranking factors.

A session with ex-Googlers has confirmed that Google uses Chrome user data:

"Most of all though, and perhaps one of the biggest points of the session was that Google definitely uses Chrome user data and can track every click within it."

Any quirky/underrated ranking factors you'd like to share?

Image credit: AlaskaTeacher

  • Very intriguing, Carter. Particularly like Point 3. Might even use it as an ad on the freelance sites I contract on...
    ...some day soon, quality copywriters will be commanding as much as SEOs - we can but live in hope???
    Cracking article and most enlightening, thank you.

    • Carter Bowles

      I sure hope so. When we look at things like Panda, we're most likely looking at evolutionary/swarm/neural algorithms that are trained on sets of "good" and "bad" content. All of this hopefully means that content itself will legitimately play a more important part in the years going forward.

      What counts as "good" content, on the other hand, is the creepier side of that coin.
