SEO Metrics that can Mislead

I’ve been in enough meetings with marketing teams to have experienced both good and bad reporting discussions countless times over. It’s pretty rare that the bad reports are generated with evil intent, but they usually stem from two problem areas:

If you’re in one of these meetings and are questioning the validity of the story being told about a campaign, there are a number of metrics people like to use that are a strong indicator of one of those two issues. Here are the most frequent offenders and some questions you can ask to get some clarity about the numbers you’re seeing.

Engagement

Engagement is a tricky one. It’s usually used in lieu of a hard goal (e.g. revenue) to tie to the results of a campaign. If it’s an awareness-building effort, maybe the expectation of revenue is not super realistic. No worries. But engagement isn’t really a metric. It’s a term we use for a collection of different metrics. We’ve written about common engagement metrics in the past. They all have their place. But engagement by itself, with a number attached, has no uniform meaning. Facebook uses it as a cumulative measure of all the actions people take on your ads (clicks, likes, etc.). A lot of agencies might use some formula to take various engagement metrics and combine them into a single score.
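To make that concrete, here’s a minimal sketch of what an agency-style composite score can look like. The metric names and weights are hypothetical, not any specific platform’s or agency’s formula; the point is that one headline number can hide very different underlying behavior.

```python
# Hypothetical composite "engagement" score. The weights and metrics are
# made up for illustration -- they are not any platform's actual formula.
WEIGHTS = {"clicks": 1.0, "likes": 0.5, "shares": 2.0, "comments": 1.5}

def engagement_score(actions: dict) -> float:
    """Collapse several engagement metrics into one weighted number."""
    return sum(WEIGHTS[name] * count for name, count in actions.items())

# Two very different months can produce the same headline score.
month_a = {"clicks": 400, "likes": 600, "shares": 50, "comments": 0}
month_b = {"clicks": 200, "likes": 0, "shares": 300, "comments": 0}
print(engagement_score(month_a))  # 800.0
print(engagement_score(month_b))  # 800.0
```

If the score goes up, you still need to know which of those inputs moved it and whether that input matters to the business.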

When you see someone talking about engagement as a performance metric, the questions to ask are fairly simple.

Goal Completions

Goals, in the Google Analytics sense, can be virtually anything. Scrolling halfway down a page, clicking on buttons, visiting 3 or more pages, etc. There are even “smart goals,” where nobody can actually list the actions that constitute a goal completion (Google uses machine learning to score your sessions and assign goal completions to the best ones. Clear as mud.). Tracking goal completions is a good thing! But you need to be aware of what makes up that number. Here are the important things to question about all goal completions.

And in a lot of cases, goals may be very closely tied to revenue, e.g. a quote request form submission. In those instances, the question of quality is important. Is there a way to judge the quality of these submissions vs. other channels? Note: it will probably come right back to you to work on answering that question. But it should still be asked.
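If even rough downstream data exists, the quality question comes down to simple arithmetic. Here’s a minimal sketch with hypothetical submission and closed-deal counts per channel; the numbers and channel names are made up.

```python
# Hypothetical quote-request submissions and the deals they eventually closed,
# broken out by channel. The figures are invented for illustration.
channels = {
    "organic search": {"submissions": 40, "closed": 6},
    "paid search":    {"submissions": 90, "closed": 5},
    "email":          {"submissions": 25, "closed": 7},
}

for name, c in channels.items():
    close_rate = c["closed"] / c["submissions"]
    print(f"{name}: {c['submissions']} submissions, {close_rate:.0%} close to revenue")
```

A channel that produces fewer submissions but closes more of them is the better story, and a raw goal-completion count won’t show that.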

Domain Authority

First off, domain authority can be a bit of a catch-all term. A lot of clever SEOs use it as shorthand for the cumulative ranking strength of the pages on your site, the idea being that a stronger site gives any individual page a greater ability to rank. But Domain Authority is also a specific metric invented by Moz. It’s essentially what I just described, a cumulative rank metric, except Moz assigns a 0-100 score and presents it as a top-line metric to aim for. That’s where it gets to be a problem.

It’s problematic in a number of ways.

First is that Google doesn’t assign rank (or “authority”) at the domain level. They do it at the page level. They have some ranking factors that apply to your domain name (e.g. exact match domains, domain age, etc.) but not to your domain in the sense of ranking all pages combined. That’s not the end of the world, but I do take issue with any SEO metric that is disconnected from the way Google assigns rank.

Second, it uses a single input: links. I’m not crazy about one-number metrics that try to weigh a bunch of different factors and combine them into a single score, but I at least get the intent there. Domain Authority gets used as a one-number metric, yet it really is built from just one kind of data: it calculates the score from a link graph, which misses a lot of other information (see the toy sketch after these points).

Third, it’s totally external to your site. Related to the second point, if you’re going to use a single number to score your progress (you shouldn’t), it should be a number that incorporates some business/internal metrics so there is a direct link between the metric and your business.
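To illustrate the second point, here’s a toy, link-only score. This is not Moz’s actual model; it’s just a simplified PageRank-style calculation over a made-up link graph, to show that a score of this kind responds to nothing except who links to whom.

```python
# Toy link-only "authority" score: a simplified PageRank-style iteration over
# a made-up domain link graph. Not Moz's actual model -- just an illustration
# that the only input is the link graph.
links = {  # domain -> domains it links out to (hypothetical)
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

scores = {d: 1.0 for d in links}
for _ in range(20):  # iterate toward a steady state
    new = {d: 0.15 for d in links}
    for source, targets in links.items():
        for target in targets:
            new[target] += 0.85 * scores[source] / len(targets)
    scores = new

# No content, no conversions, no business data ever enters the calculation.
print(scores)
```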

When domain authority gets presented to you as a metric, proceed with caution. I’d start asking some hard questions here.

Keyword Rankings

Here I’m talking about the number of keywords where your site ranks. Often this gets thrown out in reporting like “You rank for 50 more keywords compared to last month.” Cool, now what?

Usually in reports we only show rankings for keywords we’re specifically tracking, meaning terms where we worked on pages and picked keywords to target and monitor. Sometimes we’ll include a full dump of keywords from Google Search Console, but that’s nowhere near a primary way we judge performance.
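If you’re handed a raw keyword count, one useful follow-up is to split it by how well those terms actually rank. Here’s a minimal sketch, assuming a Search Console performance export saved as a CSV; the filename and column names (“query”, “position”) are assumptions, so adjust them to match your export.

```python
import csv
from collections import Counter

# Assumes a Search Console performance export with "query" and "position"
# columns; the filename and column names vary, so adjust to your file.
buckets = Counter()
with open("search_console_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        position = float(row["position"])
        if position <= 3:
            buckets["top 3"] += 1
        elif position <= 10:
            buckets["page 1"] += 1
        else:
            buckets["page 2+"] += 1

# "50 more keywords" means very little if they all landed on page 5.
print(buckets)
```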

Traffic

No, I’m not trying to just throw out a hot take here. Traffic is a perfectly good metric. But with SEO in particular it can be greatly distorted. On no shortage of occasions I’ve watched traffic growth get thrown out as the primary measure of success within 1-2 months of starting an SEO campaign. It’s totally possible to see positive traffic movement in a month or two, but we’re gonna need some context here.

Basically, don’t let anyone just take credit for traffic in a vacuum. If you’ve worked with a team for a while and they support their top-line numbers with additional data and a regular effort to improve your visibility and traffic in search, then maybe it’s fine to trust them when they boast about sessions/traffic. But early on, be wary.
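One quick piece of context is to compare organic sessions against the same period a year earlier, not just the previous month, so seasonality doesn’t get credited to the campaign. A minimal sketch with hypothetical monthly session counts:

```python
# Hypothetical monthly organic sessions. Compare year-over-year as well as
# month-over-month so a seasonal bump isn't mistaken for campaign impact.
sessions = {"2023-03": 5000, "2024-02": 4800, "2024-03": 5200}

mom = (sessions["2024-03"] - sessions["2024-02"]) / sessions["2024-02"]
yoy = (sessions["2024-03"] - sessions["2023-03"]) / sessions["2023-03"]
print(f"Month over month: {mom:+.1%}")  # +8.3% -- looks like growth
print(f"Year over year:   {yoy:+.1%}")  # +4.0% -- some of that bump is seasonal
```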

For everything on this list (and probably any other reporting discussion) you of course need to examine whether any external factors influenced what’s happening. Did Google make some big change? Did a competitor go out of business or badly screw up their SEO? Did weather or news change people’s behavior? Basic stuff, but things that can often get overlooked.

I’m intentionally not turning this around into metrics you should look at instead. Even the metrics above could be good in the right context. My mission is for us to just stop blindly accepting whatever numbers are thrown our way and start asking questions to hold marketers accountable for growing businesses.
