Realistic SEO – Understanding Rank Potential

Forrester Research projects that U.S. consumers will spend $327 billion online in 2016. With Google seeing approximately 7.1 billion searches per day globally* (across mobile, desktop, and tablet), we can approximate that every single U.S. query is worth $1.26 (based on the U.S. representing 10% of the world's internet users).

*Update: thank you Rand Fishkin for pointing out the original stat was outdated.
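As a quick sanity check, here's that back-of-the-envelope math as a sketch, using the figures above (the 10% U.S. share is the post's assumption):

```python
# Back-of-the-envelope value per U.S. search query, using the stats above.
us_online_spend = 327e9            # Forrester's 2016 U.S. projection, USD
global_searches_per_day = 7.1e9    # approximate Google queries/day
us_share = 0.10                    # assumed U.S. share of internet users

us_searches_per_year = global_searches_per_day * 365 * us_share
print(f"${us_online_spend / us_searches_per_year:.2f} per query")  # $1.26
```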

In a recent study by BrightEdge, SEO accounted for as much as 51% of the traffic driven to B2B and B2C pages, reinforcing that SEO is far from dead and continues to offer a long-term, sustainable ROI channel.

So, why are we talking about SEO rank potential? Regardless of who you need to pitch SEO to (your boss, your client, or a business partner), it all comes down to one question: how much time and money will this cost me?

Prior to launching any SEO campaign, you need to know what resources are required to achieve your business goals. That way you can answer the question realistically.

It's generally a bad idea to pitch an SEO campaign without first researching the keywords you're going to target and understanding who you're competing with… if you bet the farm on unattainable keywords, you're gonna have a bad time.

So What is Rank Potential?

Rank potential is an analytical approach to understanding where a given webpage can realistically rank in organic search, with respect to two axes of consideration: time and investment*.

*that’s my definition

It's not realistic to project that you're going to outrank a Wikipedia page for an informational query, or a Fortune 500 brand for its brand or branded product name, at least not without significant investment, if ever.

My approach is to analyze a page's rank potential based on the qualitative SEO metrics of the current top 10 ranking URLs (what I will refer to as search engine results page 1, or SERP 1).

The metrics I analyze are:

  • Number of Links
  • Number of Linking Root Domains
  • Trust Flow
  • Citation Flow
  • Domain Authority
  • Page Authority

In addition to this core set, I also evaluate additional relative measures of authority, including Domain Age, Organic Difficulty, Link Strength, Brand Footprint, and Social Media Impact.

Now, I promise this is far from a beginner post, but to make sure you're thinking of the base metrics the same way I do, I'm going to run through a quick and dirty description of each from my perspective.

My Perception of the Metrics

Number of Links

More important than the raw number of links, this metric is used in conjunction with the number of unique linking root domains to determine the diversity ratio. Organically authoritative websites have high link diversity ratios (LDR: links from many unique, authoritative root domains) rather than gobs of links from the same 5 websites, likely owned by the same person.

In addition to the number of links as an input to LDR, it is also important to look at link velocity. If a site is picking up tons of links very quickly, there should be an obvious cause such as press coverage, a product launch, a new partnership, or a positive mention in a very large publication. If not, it's a sign that darker activities may be afoot.

Number of Linking Root Domains

As mentioned above, this is generally a sound measure of the organic authority of a website. Like everything else in SEO there are always exceptions to the rule, but websites that maintain an organic link profile generally have anywhere from a 10 – 30% diversity ratio.

That means 1 linking root domain for every 3.3 to 10 indexed links.
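As a minimal sketch of that calculation, assuming you've already pulled the two counts from your link index (the 10 – 30% band is my rule of thumb, not an industry standard):

```python
def link_diversity_ratio(linking_root_domains: int, total_links: int) -> float:
    """LDR: unique linking root domains as a share of total indexed links."""
    return linking_root_domains / total_links

# 200 root domains across 1,000 links = 1 LRD per 5 links.
ldr = link_diversity_ratio(linking_root_domains=200, total_links=1000)
print(f"{ldr:.0%}")         # 20%
print(0.10 <= ldr <= 0.30)  # True: inside the organic-looking band
```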

Trust Flow

This metric goes hand in hand with the next metric, Citation Flow (CF), but for the purposes of this post I'll describe it on its own. This measure, like CF, was developed by Majestic, and is a relative measure of how trustworthy a link is.

Majestic has built a manually reviewed index of trusted websites and uses a general rule developed from manual analysis of a representative sample of the URLs included.

The rule is based on their finding that:

trustworthy sites tend to link to trustworthy neighbors – Dixon Jones, Founder of Majestic SEO

Trust Flow (TF) is a logarithmic scale from 0 to 100. In my experience you should stay away from links with a TF under 25.

Citation Flow

Citation Flow is a predictive measure of how much influence a given URL is likely to pass on to the links it points to.

With specific respect to link juice, URLs with higher CF are more likely to pass more of that influence downstream to the URLs they link out to.

The practical application: links from pages with higher CF send stronger positive signals than their weaker alternatives (generally, anything below 15 is suspect).
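To make those two cutoffs concrete, here's a hedged sketch of a link-vetting filter; the `LinkProspect` structure and example URLs are hypothetical, and the 25/15 thresholds are just my personal rules of thumb from above:

```python
from typing import NamedTuple

class LinkProspect(NamedTuple):
    url: str
    trust_flow: int     # Majestic TF, 0-100
    citation_flow: int  # Majestic CF, 0-100

MIN_TF, MIN_CF = 25, 15  # my rules of thumb, per the sections above

def worth_pursuing(link: LinkProspect) -> bool:
    """Skip links whose TF or CF falls below the suspect thresholds."""
    return link.trust_flow >= MIN_TF and link.citation_flow >= MIN_CF

prospects = [
    LinkProspect("https://respected-news.example", trust_flow=42, citation_flow=38),
    LinkProspect("https://spammy-directory.example", trust_flow=12, citation_flow=40),
]
print([p.url for p in prospects if worth_pursuing(p)])
# ['https://respected-news.example']
```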

Domain Authority

Domain Authority is an SEO KPI created by Moz, and represents a relative measure of a domain's authoritative value on a logarithmic scale from 1 to 100. It is generally very accurate as a barometer for how *powerful* a domain is, in consideration of that domain's link profile and citation flow. One limiting factor is that it is calculated based only on links contained within Moz's web index, Mozscape.

Page Authority

The child of Domain Authority, Page Authority uses the same scale but is scored at the individual URL level rather than for the entire domain. It is a good indication of how powerful a specific URL is within a given domain, and a great second-tier barometer for gauging the difficulty of outranking a page.

For Additional Consideration

Domain Age

This one is very obvious; it's exactly what you think it is. How old is the domain, as in how long has it been registered? What is the history of the domain and its indexed URLs and links? How many times has it changed owners (WhoIs), IPs, or nameservers?

The big search engines use some or all of these signals as trust indicators. In addition, it is speculated that websites less than ~6 months old (think freshly registered domains) are likely to experience what is often referred to as Google Jail if they try to acquire links too quickly.

While Matt Cutts has hinted that domain age may not be a big factor, older, more established, and more trusted domains are going to have an advantage when it comes to ranking.

Brand Footprint

What I am specifically talking about here is a general sentiment analysis that can be run quickly and manually for a website's brand name. In my experience, certain search verticals use variations of Google's ranking algorithms to serve and rank different kinds of results.

In my very humble opinion, this is why you will sometimes see search results with a lot of rich snippets, or packs of video results, and sometimes even results from review or complaint websites. In these instances it is important to consider the diversity of the results and think about the experience that G is trying to provide.

If a brand (or entity) has swaths of negative press around it from certain kinds of review websites, regardless of how weak those URLs may be, it may be harder to unhinge and outrank. I believe this also has a lot to do with the idea behind time to long click, G's projected measure of quality and user satisfaction.

Social Media Impact

I've saved the most variable metric for last, which is also the hardest to define. More than anything else, I'm looking at what the social properties for the brand look like: how often do they update them? How popular are they? Do they own at least 80% of their brand SERP?

If not, what other kinds of websites are ranking? Are there competitor sites in their brand SERP? (That's generally a good sign; it means G does not yet see them as the established entity for that keyword.)
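A rough way to put a number on that 80% rule of thumb: given the top results for a brand query, count how many the brand actually controls. The helper and example URLs below are hypothetical:

```python
from urllib.parse import urlparse

def brand_serp_ownership(result_urls: list[str], brand_domain: str) -> float:
    """Share of a brand SERP occupied by the brand's own domain (and subdomains)."""
    owned = sum(1 for url in result_urls
                if urlparse(url).netloc.endswith(brand_domain))
    return owned / len(result_urls)

top_results = [
    "https://www.examplebrand.com/",
    "https://www.examplebrand.com/about",
    "https://blog.examplebrand.com/launch",
    "https://twitter.com/examplebrand",
    "https://www.reviewsite.example/examplebrand-complaints",
]
share = brand_serp_ownership(top_results, "examplebrand.com")
print(f"{share:.0%} brand-owned")  # 60%, below the 80% target
```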

Getting Realistic with SEO

What better way to crush the dreams of all the aspiring SEOs out there than to break down just how realistic (also read: expensive) it would be to rank on some of the most coveted enterprise SERPs.

For this I'm going to analyze SERP 1 for keywords in the following 3 verticals:

  1. credit cards
  2. used cars
  3. home loans

For each of these verticals I'm going to run a keyword discovery report, select the 3 keywords with the highest search volume (not necessarily the seed keyword), and then analyze the rank potential of the SERPs based on the metrics I listed above.

It's going to be a relatively rough breakdown, but my goal is to illustrate the time, money, and resources you need to invest if you're going to crack big-money SEO.

Ready to get realistic with your SEO? Drop me a line ›

The Keyword Discovery Process

The fastest way to generate solid keyword ideas while getting all the important metrics you need is to use a tool that does the heavy lifting for you.

Fire up whatever your keyword tool of choice is. For this post I'll be using Term Explorer, mostly because it gives me practically unlimited relevant suggestions (up to 90,000), but also because it provides all the search volume and competitive data I need to get started.

*Update: Term Explorer has published this killer process for how to prioritize your keywords

Using each keyword as a base seed, I’m going to run 3 small keyword jobs and get a quick 1,000 related keywords with most of the directional data I need:

[Screenshot: Keyword Discovery Results for Credit Cards]

[Screenshot: Keyword Discovery Results for Used Cars]

[Screenshot: Keyword Discovery Results for Home Loans]

Time to Select Our Analysis Pool

Based on the above results, I've selected the following 9 SERPs to review for rank potential. I'm going to dissect one of the most coveted terms in search (credit cards) and then select one term from each of the other keyword sets to analyze.

You can click each of the keywords below to access all the SERP-specific data for each term; I've made it publicly available using Term Explorer's SERP share functionality.

Credit Cards

  • credit cards
  • credit cards for bad credit
  • best credit cards

Used Cars

  • used cars
  • used cars for sale
  • used car dealerships

Home Loans

  • mortgage rates
  • mortgage payment calculator
  • fha loan

All SERP screenshots were taken on 3/15/15 using Google Chrome, logged out, in an incognito window.

SERPs for “Credit Cards” Keywords

I’ve included SERP screenshots for each of the enterprise keywords I’ve chosen for this post.

Even though I'm not going to do a full analysis of all 9 keywords, I wanted to include all of the screenshots for reference, based on some interesting nuances I observed across these SERPs while selecting terms for analysis. I'll explain more on this later.

“credit cards”

credit cards - Google Search

“credit cards for bad credit”

credit cards for bad credit - Google Search

“best credit cards”

best credit cards - Google Search

SERPs for “Used Cars” Keywords

“used cars”

used cars - Google Search

“used cars for sale”

used cars for sale - Google Search

“used car dealerships”

used car dealerships - Google Search

SERPs for “Home Loans” Keywords

“mortgage rates”

mortgage rates - Google Search

“mortgage payment calculator”

mortgage payment calculator - Google Search

“fha loan”

fha loan - Google Search

A Quick Observation

In case you didn't notice, the SERPs with Google News results show only 9 results outside of the news element. Yet when a local search pack is rendered and there are no news results, 10 URLs are still shown.

Let’s Analyze Rank Potential

I'm going to approach this in 2 ways:

  1. The maximum search rank I believe can be achieved, and at what cost
  2. What will be required to achieve that ranking, including acquisition costs for the domain vs. website, links, and content

Here are the top 10 ranking URLs for our first credit card keyword:

[Screenshot: SERP 1 metrics for “credit cards”]

And here are the same results when I highlight them competitively from green to red, i.e. green is GOOD for us and red is BAD, relatively speaking.

[Screenshot: SERP 1 metrics for “credit cards”, color-coded]

Breaking this Down

First and foremost, it's important to understand that the analysis we're doing here is based on all metrics at this instantaneous point in time (IPT). This is particularly relevant when looking at enterprise SERPs, as they tend to change frequently: not necessarily in terms of rankings, but in terms of qualitative metrics, i.e. these pages will likely continue to acquire more links…

Be Sure To Consider Velocity

As further reinforcement of my note above, to get a better sense of which of these URLs is building its arsenal versus potentially slowing down (or leveling out), we want to look at the velocity at which they are acquiring new links.

For this I personally pay closer attention to the number of new linking root domains a site is adding, rather than just the number of links from pages. In the case of creditcards.com's link profile (pictured below), we see they've recently been adding crap tons of new links, but the trajectory of new linking root domains is a very different angle.

[Screenshot: creditcards.com link profile]

They went from 9,567 LRDs on March 1st to 9,273 on March 31st, a net loss of 294. Let's compare this to the chase.com subdomain ranking #2:

[Screenshot: creditcards.chase.com link profile]

creditcards.chase.com went from 1,511 linking domains on March 1st to 1,443 by March 31st, another net loss, but only 68. In terms of magnitude, however, creditcards.com lost ~3% of its LRDs versus chase.com, which lost closer to 5%.

So at first glance, chase.com's link profile looks healthier from a diversity-ratio standpoint (3.6% versus creditcards.com's 1.1%), but chase is shedding LRDs at a faster rate.

It's also worth noting that proper velocity analysis should be done over at least the trailing 12 months, if not longer, not just the most recent full calendar month; this was done only to show an example of link relativity for the purposes of this post.
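Here's the same month-over-month comparison as a quick script (figures copied from the charts above; in practice you'd pull these counts from Majestic or Ahrefs over a longer window):

```python
def lrd_velocity(start_count: int, end_count: int) -> tuple[int, float]:
    """Net change in linking root domains and percent change over the period."""
    net = end_count - start_count
    return net, net / start_count

march_lrds = {
    "creditcards.com":       (9_567, 9_273),
    "creditcards.chase.com": (1_511, 1_443),
}
for site, (start, end) in march_lrds.items():
    net, pct = lrd_velocity(start, end)
    print(f"{site}: {net:+d} LRDs ({pct:+.1%})")
# creditcards.com: -294 LRDs (-3.1%)
# creditcards.chase.com: -68 LRDs (-4.5%)
```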

Circling Back to SEO Metrics

So I've made my point about velocity; now I want to return to the Excel screencap above.

What we’re looking for is horizontal bands of green, hence the color coding to assist with visual identification.

So the first row that jumps out at me is #5, creditkarma.com, with the following SERP metrics:

  • Outbound Links: 0
  • Relevancy Score: 6
  • Page Links: 391
  • Page PageRank: 4
  • Page Authority: 32
  • Domain Age: 9.5 years
  • Domain PageRank: 5
  • Domain Authority: 69
  • Domain Links: 2,193,824
  • Trust Score: 10
  • Link Strength: 10
  • Difficulty Score: 9.99
  • Trust Flow: 16
  • Citation Flow: 25

So as a whole, and without context, this is a pretty scary set of SEO metrics to compete with out of the gate. But as is the case with everything in life, context is king – and given these metrics are the foundation behind the URL ranking #5 for the term credit cards, this actually isn’t so bad.

The scariest numbers are the DA at 69, the trust score at 10, and the domain links at 2.1 million. But the flip side of the scary coin is the opportunity coin, and some of these metrics are actually pretty exciting: page links under 400, domain age under 10 years (on a SERP where the average domain age is ~15 years), and a page authority under 40.
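One way to formalize that "horizontal bands of green" scan: normalize each difficulty-style metric across the top 10 and flag the row with the lowest combined score. This is a sketch of my approach, not a published formula; creditkarma.com's numbers come from the list above, while the other row is made up for illustration:

```python
METRICS = ["page_links", "page_authority", "domain_authority", "trust_score"]

serp_rows = [
    {"url": "creditcards.com", "page_links": 12_000, "page_authority": 71,
     "domain_authority": 85, "trust_score": 14},   # illustrative figures
    {"url": "creditkarma.com", "page_links": 391, "page_authority": 32,
     "domain_authority": 69, "trust_score": 10},   # figures from above
    # ...remaining SERP 1 rows would go here
]

def weakness_score(row: dict) -> float:
    """Sum of min-max normalized metrics; lower = weaker = better target."""
    score = 0.0
    for metric in METRICS:
        values = [r[metric] for r in serp_rows]
        lo, hi = min(values), max(values)
        score += (row[metric] - lo) / (hi - lo) if hi > lo else 0.0
    return score

target = min(serp_rows, key=weakness_score)
print(target["url"])  # creditkarma.com
```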

Backing in Cost to Rank

This is where the process becomes subjective, relative to the SEO and their available resources for ranking.

My process is based on computing the value of the projected traffic using my keyword valuation model. However, I don't spend my days competing in enterprise SERPs, so I asked for insight from some SEOs who do.

Here’s how Ian Howells, Owner of Hustle Savvy, analyzes rank potential:

[Screenshot: Ian Howells' rank potential model]

Running Ian’s Model

So, taking Ian's quantitative model into consideration, let's use some conservative revenue projections and calculate the rank potential and cost to rank for our next enterprise keyword from the "home loans" set: mortgage rates.

For the purposes of my spin on the valuation model, I'm going to look only at SERP 1 (whereas Ian looks at SERPs 1 and 2). Apologies, these super-wide screenshots are tough to read.

[Screenshot: SERP 1 metrics for “mortgage rates”]

Here's a rundown of approximately how many links each URL is adding or losing (net) per month, based on available data from Ahrefs, including both recrawled and dropped links:

  1. Zillow.com = -30,000
  2. BankRate.com = -600
  3. WellsFargo.com = -1,100
  4. MortgageNewsDaily.com = -4,000
  5. QuickenLoans.com = -23
  6. MLCalc.com = -7,500
  7. LendingTree.com = -5,000 (note major swings from 1-2k per day)
  8. HSH.com = +1,100
  9. Topics.Bloomberg.com = -25

So it seems almost all of these are losing a net number of links each month…

Well that was unexpected.

Doing The Math

I started writing this post a long time ago, sometime over the summer of 2014, so I've had to update the SERP metrics more than I'd like to admit. I'm going to update them one more time right now: the SERP has changed again since I grabbed the information above almost a month ago.

As of the writing of this sentence, it's April 13, 2015, and the link counts for the above URL set are different. As it stands right now, the lowest LRD count in this SERP is the lendingtree.com page with 83 linking domains (setting aside that the domain appears to be losing thousands of links per month).

If we use Ian's cost projections (which I think are pretty reasonable), we have a base link requirement of 83 links multiplied by $300 per link = $24,900 to crack SERP 1 for this term (in pure link cost).

Starting to put things into perspective?

Big SEO is expensive. Now let’s see if it’s worth it…

First we need a rough approximation of what a mortgage broker makes on these leads. Moneysense.ca estimates a mortgage broker makes ~$2,250 on a $300,000 mortgage, so let's say a lead is worth 10% of that, give or take: $225*.

*I have never worked in the mortgage space; if you have a better measure, please share in the comments and I'll update here.

Since it's a pretty qualified search, I'm going to assume a lead response rate of 3%. The head term has a monthly search volume of 246,000, and the first set of results in the discovery-run screenshot comes in conservatively around 2,000,000 searches/month…

We're looking at, conservatively, 3% of those 2,000,000 searches making it to that page per month, so ~60,000 qualified visits, 3% of whom convert as leads = 1,800 new leads. If only 10% of those leads convert into mortgages, you're making $40,500 per month (180 x $225). But let's dial this way back to more conservative numbers.

If an acceptable break-even period is 12 months, that means you would need to make ~$2,075/mo, which based on this population size would mean getting just over 1% of the people searching for just the head term to your website.

If you can even make it to page 2 you have a good chance of doing that…

1% of searches for "mortgage rates" is 2,460 visitors per month; with a 3% response rate that's 73 leads; converting 10% into customers, you're already at $1,575/mo (7 new customers per month), and your payback period shifts to roughly 16 months – still within Ian's model.
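Pulling the worked example into one place, here is a sketch of the payback math using Ian's $300-per-link figure and the conservative funnel assumptions above (all rates are the post's assumptions, not industry data):

```python
# Payback model for the "mortgage rates" example above.
LINKS_NEEDED  = 83     # lowest LRD count on SERP 1 (the lendingtree.com page)
COST_PER_LINK = 300    # Ian's per-link cost projection, USD
LEAD_VALUE    = 225    # ~10% of a broker's ~$2,250 commission
LEAD_RATE     = 0.03   # searchers who convert to leads
CLOSE_RATE    = 0.10   # leads who become customers

link_cost = LINKS_NEEDED * COST_PER_LINK          # $24,900
visitors  = int(246_000 * 0.01)                   # ~1% of head-term searches
leads     = int(visitors * LEAD_RATE)             # 73
customers = int(leads * CLOSE_RATE)               # 7
monthly_revenue = customers * LEAD_VALUE          # $1,575
payback_months  = link_cost / monthly_revenue     # ~15.8

print(f"${link_cost:,} in links, ${monthly_revenue:,}/mo, "
      f"payback in ~{payback_months:.0f} months")
```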

My Rank Potential Model

I look at this just a *bit* differently, in that my model is even a little more quick and dirty.

I like to see the SERP data as a whole (in Excel), identify the row that stands out as the weakest link, put that in context of the other ranking pages, and back into what it would cost to get there.

So, moving on to a keyword from the last enterprise set, I'm going to look at the SERP for "used cars."

[Screenshot: SERP 1 metrics for “used cars”]

Do you see what I see?

#5, http://www.carsales.com.au/new/, has lots of green in its row, including one major flag: no link flow scores.

Upon further inspection, this page is returning a 404 error, and when I check this SERP today it is no longer ranking, which I'll come back to in a bit.

But what this means is that these metrics were enough to get the page to this point: ranking in the top 5 for a head keyword that receives 450,000 searches per month.

Now, if I were going to go after this SERP specifically, with the hope of getting a page to #5, here's what I would consider my costs to be:

Domain

The ranking domain is only 5 years old (which is great), it's not an EMD, and it has a domain authority of 68 and a domain PageRank of 6. I should be able to pick up a comparable domain in the $3,000 – 5,000 range; however, because this is the auto vertical, it's likely I would need to spend a bit more, so let's call it $10,000.

Development and Content

From a quick glance, it looks like this site is huge, to the tune of ~40M pages. This is definitely bad news, but the good news is these pages are all database-driven, and this content can be scraped from any of the other car listing sites. Still, this is going to represent a major development cost; let's say $50,000 for the website and content.

Links

It gets harder still: the domain link profile for this site is serious, with over 30 million links. The good news is the page we're looking to outrank has only 1,602.

Thinking Through This

Without some very compelling reason to get into the used car space, there's no way I would build this website. But depending on the projected returns, I might try to rank a page for only the head-term set of keywords and sell the leads.

In that case I would chop my budget up into $10,000 for the domain, $5,000 for content, and $5,000 for links. At the $10,000 price point I would specifically be looking for a domain with a DA of 60+ and an existing base of links, so my link budget could be dedicated to new high-end links with laser relevancy to used car topics.

At that budget I'm fairly confident that I would, at the very least, be able to crack page 1, and if need be, sell the website to recover the investment capital (if the average price per lead turned out to be too low).

Key Takeaways

Spot-checking enterprise SERPs is something I've enjoyed more and more each year. What continues to impress me is how often these SERPs are in complete flux.

For example, when I was checking the link numbers above, I re-crawled the "used cars" SERP and found that, even logged out and incognito, I was still being served some local results based on my IP address, including a 3-pack from Craigslist. Craigslist is so hyper-local that it doesn't even show up in the global SERP.

What's more, every one of the SERP screenshots taken above on 3/15/15 is different as of 4/13/15. This is fantastic news for SEOs, as a SERP in flux represents opportunity. Another good sign is that the above SERPs are not crowded out by authority domains, Wikipedia, or answer boxes.

So, given that these are all highly commercial keywords with monetization intent, they all represent SERPs with potential. But before digging into rank potential, you need to ask whether you really deserve a first-page Google ranking.

Get your own rank potential analysis right now! Contact us for a quote!

In Closing

Earlier in the post I said I would explain why I left up all 9 SERP screenshots despite only analyzing 3: the reason is that I want to know what you think.

What do you notice based on the signaling UI elements that are rendering? If you decide to pull any fresh link data, please leave a comment and I will update the post to include any new findings.

Thank you.
