Jeff Baker

“We need to get our blog’s bounce rate down”

“Our site traffic is down compared with last month. What should we do?”

“We need to use our target keywords more often”

If any variation of this drivel has come out of your mouth (or your vendor’s) in the last calendar year, come to my office.

“What a grouchy hardass,” you’re thinking.

And you’re not wrong! I have zero chill.

But here’s the thing … I’m right.

The honeymoon phase is over

Digital marketing is no longer a trendy “fad” that businesses want to be a part of just because it’s hot.

That means your impotent metrics no longer cut the mustard as “results” (I’m looking at you too, “Influencers”).

Rather, digital marketing is now a core piece of an entire business strategy, as much so as the sales department. If it fails, businesses will go under, and employees will get fired.

That means businesses can’t afford to throw good money at shitty ideas and insights.

So … if you have said or accepted any of the statements above, you have shitty ideas, or are being fed shitty ideas, and your ignorance is putting people’s livelihoods in jeopardy.

This goes for consultants, in-house marketers or SEO/content marketing vendors.

In this article I’m going to outline common thinking traps, and how to shift your thinking toward the things that matter.

Thank you for coming to my TED Talk. Proceed below.

    1. You’re treating engagement metrics like goals
    2. Falling in love with organic traffic
    3. Poor date comparisons
    4. Your analytics tools aren’t accurate
    5. Creating straw-man comparisons

1. You’re treating engagement metrics like goals

Nobody has ever made any money by lowering their bounce rate. Not. Ever.

“The blog’s bounce rate is up 4% this month. We need to add some calls to action to increase ‘stickiness’ and get it back down.”

This is flawed thinking that has persisted for as long as I can remember. The way I figure it, it arose for a few reasons:

1. Old-school stigma: “Bounces,” or people who visit only one page and then immediately leave the site, have been stigmatized as “bad.” Similarly, “low” average session duration and pages per visit are associated with failure. The short-sighted thinking is that someone who doesn’t spend much time on your site didn’t find what they were looking for.

2. Analytics tools: Web analytics tools throw it in your face – it’s literally one of the first things you see:

Notice that each of these tools reports different numbers? It’s because none of the tools are correct (more on that later).

3. Misunderstanding how content marketing works: Some pages are going to have high bounce rates. Like blogs. And that’s fine. You need to understand the intent of a person landing on different parts of a site. Setting a target for bounce rate across two completely different types of pages is, frankly, stupid. It would be more useful to compare similar pages to one another and look for anomalies.

Why it’s wrong to lean on engagement metrics

Engagement metrics, taken on their own and absent commercial goals, mean nothing. Let me explain with two scenarios. Tell me which you like better:

Page 1 has a bounce rate of 86% and a Contact conversion rate of 2.3%.

Page 2 has a bounce rate of 53% and a Contact conversion rate of 0.7%.

Which page is performing better?

Page 1 by leaps and bounds. Why? Because Contact conversions have a direct tie to revenue. Bounce rate does not.
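
Want to see it in cold, hard numbers? Here’s a back-of-the-napkin sketch in Python. The session count and deal value are hypothetical; the point is that bounce rate never enters the revenue math:

```python
# Hypothetical inputs: 10,000 sessions per page and $500 of revenue
# per Contact conversion. Notice bounce rate appears nowhere in the math.
sessions = 10_000
revenue_per_conversion = 500  # assumed average deal value

pages = {
    "Page 1": {"bounce_rate": 0.86, "conversion_rate": 0.023},
    "Page 2": {"bounce_rate": 0.53, "conversion_rate": 0.007},
}

for name, page in pages.items():
    revenue = sessions * page["conversion_rate"] * revenue_per_conversion
    print(f"{name}: bounce {page['bounce_rate']:.0%}, revenue ${revenue:,.0f}")

# Page 1: bounce 86%, revenue $115,000
# Page 2: bounce 53%, revenue $35,000
```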

Not convinced yet? Let me hit you with another scenario:

Page 1 landing page results in a 2:13 average session duration.

Page 2 landing page results in a 6:34 average session duration.

Which page is performing better?

Trick question. We don’t know yet.

Page 1 may fully satisfy the visitor. Page 2 may be the result of six minutes and thirty-four seconds of pure frustration and agony over your atrocious navigation. Absent a goal that contributes to revenue, none of these metrics matter.

Lastly, it’s important to understand the intention of your visitors landing on specific sections of the site. For example:

      • Blogs: Will always have a high bounce rate. Is that bad? No, most people landing on a blog used a keyword that has an informational or “learning” intent. These people never had any intention of hanging around and checking out your product pages. They are going to read and bounce. Accept that.
      • Homepage: Should have a lower bounce rate than the blog, BUT a high bounce rate here is only a problem if the conversion numbers aren’t up to par.
      • Landing pages: Same as the above.

What you SHOULD be looking at

Use engagement metrics to benchmark similar pages against one another.

See anything odd about these pages? Look a little more closely at this one:

That should stick out like a sore thumb. But here’s the important part: we need to start with the conversion numbers and work our way backwards.

The poor bounce rate and time on site could be indicators of why the page is performing so poorly.

Why is it performing so poorly?

Who knows, but it needs to be investigated, because there is a good chance that this page isn’t delivering the same satisfaction to visitors as the other pages. That’s a problem.

It could be a UX problem. A content problem. An expectation problem. Or all of these things.

The important thing to note here is that the bounce rate is a tool and an indicator, not a measure of success. And we know this because we have set it against similar pages with similar expectations.
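
If you want to operationalize that, here’s a minimal sketch (with made-up numbers) of what “start with conversions, then use engagement as a diagnostic” looks like for a set of similar pages:

```python
from statistics import mean, stdev

# Hypothetical numbers for a set of *similar* pages (same template,
# same visitor intent). Start with conversions, then use bounce rate
# as a diagnostic for the laggards.
pages = [
    {"url": "/services/a", "conversions": 12, "bounce_rate": 0.48},
    {"url": "/services/b", "conversions": 9, "bounce_rate": 0.51},
    {"url": "/services/c", "conversions": 0, "bounce_rate": 0.83},
    {"url": "/services/d", "conversions": 11, "bounce_rate": 0.46},
]

rates = [p["bounce_rate"] for p in pages]
mu, sigma = mean(rates), stdev(rates)

# Sort worst-first by conversions; bounce rate explains, it doesn't judge.
for p in sorted(pages, key=lambda p: p["conversions"]):
    z = (p["bounce_rate"] - mu) / sigma
    flag = "  <-- investigate" if p["conversions"] == 0 and z > 1 else ""
    print(f"{p['url']}: {p['conversions']} conversions, "
          f"bounce {p['bounce_rate']:.0%} (z={z:+.1f}){flag}")
```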


2. Falling in love with organic traffic

Too many digital marketers and content marketers treat organic traffic like it’s the only component of a digital marketing strategy. Especially when they are in the business of delivering organic traffic.

“Your organic traffic is up 6% month over month, that’s great news!”

Why is that great news?

      • Has it resulted in more conversions? Have you looked?
      • How qualified was the organic traffic?
      • Are you ranking for keywords from one single article that accounted for the entire 6% increase in organic traffic?

You get the point. Reporting a green number (an increase) as though it exists in a vacuum, without further investigation, is a major mistake.

Why it’s wrong to report organic traffic growth as ROI

You’re going to see a theme here …

You forgot to look at the rest of the report. While traffic went up, you have no idea which pages caused the increase, nor do you know if the keywords driving traffic have any commercial value, or if they were mere accidents.

Frankly, I’d rather have less traffic and more conversions.

What you SHOULD be looking at

1. Trends in search visibility

I’ve got a tough pill for you to swallow: Organic traffic is not the “best type” of traffic. And you shouldn’t compare it to other traffic channels, as they are all different and serve unique purposes.

Organic traffic gives you a good sense of your site’s search visibility. In other words, how often is your content showing up in search results and driving traffic?

You’re looking for trends and patterns, which take months to fully develop due to natural fluctuations in search volume, seasonality, bots throwing off your numbers, cookie blockers, etc.

Start with Google Search Console data to get a sense of your search visibility. In Google Analytics, go to: Acquisition → Search Console → Landing pages.

Make sure to stretch out your timespan to include a minimum of six months of data. Remember: trends develop over time; anomalies and fluctuations happen weekly.

This report tells us a ton about our search exposure. For starters, we have a baseline of roughly 2.4 million impressions and 42,000 clicks per month. Now let’s take a look at trends:

From early-to-mid 2018 we grew by about 150%, then plateaued for about 9 months. But we only know that we are in a plateau because we have a long view of the data. If I were to cherry-pick a time frame to compare, like the past three weeks compared with the previous three weeks, it would show a beautiful green number.

But that’s not very honest, is it? With a long view, we know this could easily be a blip.
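
If you’d rather not eyeball the chart, here’s a rough sketch of the same idea with pandas. It assumes you’ve exported daily Search Console data to a CSV with “Date” and “Clicks” columns (adjust the names to match your export):

```python
import pandas as pd

# Assumes a daily Search Console performance export with
# "Date" and "Clicks" columns; adjust names to match your file.
df = pd.read_csv("gsc_performance.csv", parse_dates=["Date"])
df = df.sort_values("Date").set_index("Date")

# A 28-day rolling mean smooths out weekends, bots and other
# weekly noise so the underlying trend is visible.
df["clicks_trend"] = df["Clicks"].rolling(28).mean()

# The cherry-picked "green number": last 3 weeks vs. previous 3 weeks.
last_3w = df["Clicks"].iloc[-21:].mean()
prev_3w = df["Clicks"].iloc[-42:-21].mean()
print(f"3-week change: {last_3w / prev_3w - 1:+.1%}")

# The honest view: month-end trend values over the full window.
print(df["clicks_trend"].resample("M").last())
```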

2. Traffic performance

None of this matters one lick if the organic traffic you are driving isn’t doing what you want it to do. When it comes down to it, you’re in the business of creating revenue, not driving pointless traffic.

Add an advanced segment for organic traffic to your reporting, then go to Conversions → Goals → All Goals.

Compare the most recent six months with the previous year (we will talk about why we don’t compare period over period later).

There’s our baby right there, and it’s the only thing that matters when evaluating the increase in organic traffic. This proves that the increase in organic traffic has resulted in an increase in commercial value.


3. Poor date comparisons

Whether you are on the giving or receiving end of analytics reports, you need to keep an eye on the comparison dates used for reporting traffic.

Simply adding or subtracting a few days can turn that ugly red number into a beautiful green one, and you would never be the wiser.

Look what happens when I compare May 2019’s traffic with April 2019’s traffic:

Beautiful, right? This is something I would want to report to a client.

Only … it’s completely misleading. I cherry-picked a month with high traffic compared with a relatively weak previous month.

Look what happens when I compare April-May 2019 with February-March 2019 instead.

That’s a more accurate reflection of what’s going on. Pretty sneaky, huh? And this is a very simple example of how I manipulated the data. I could use about a dozen other simple tricks that produce green numbers, and technically, it wouldn’t be lying.

Why it’s important to look at meaningful time periods

Traffic is affected by a ton of things that cause false positives left and right, if the data isn’t evaluated properly. Some of those things include:

      • Seasonality.
      • Weekends.
      • Random bots.
      • Anomalous spikes in traffic from any channel.
      • Unusual internal traffic.
      • Tracking blockers.
      • Cookie issues.

What you SHOULD be looking at

1. You need enough data to normalize against anomalies.

As a rule of thumb, I never look at anything without a minimum of three months of data. Anything short of that will give you a very poor idea of actual trends.

2. You need to compare year-over-year

Traffic patterns will emerge, year-over-year. Most B2B sites will see a sharp decrease in traffic during the winter holidays, as most people are taking a vacation, and their budgets are locked in.

B2C sites will generally see an increase in traffic around the holidays.

Aside from holidays, your site will develop other unique traffic patterns for reasons you may never understand. And for that reason, you always need to compare year-over-year. This will ensure you are comparing apples to apples.
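
As a sketch, here’s the comparison I actually trust, in pandas. The file name and dates are placeholders; the point is six months of data, aligned against the same six months a year earlier:

```python
import pandas as pd

# Assumes a daily traffic export with "date" and "sessions" columns.
df = pd.read_csv("sessions.csv", parse_dates=["date"]).set_index("date")

# Six recent months vs. the same six months one year earlier:
# apples to apples, and long enough to absorb weekly anomalies.
current = df.loc["2019-01-01":"2019-06-30", "sessions"].sum()
year_ago = df.loc["2018-01-01":"2018-06-30", "sessions"].sum()
print(f"Year-over-year change: {current / year_ago - 1:+.1%}")
```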

This time period meets the minimum qualifications.

And this is a report I can trust. I can say comfortably that traffic has increased about 50% over the past year, give or take about 10%.

Why give or take 10%? Because analytics tools are highly inaccurate.

Surprised? Let’s talk about it.


4. Your analytics tools aren’t accurate

Whoa! Mind blowing, right?

What if I told you that you should allow for a 10% margin of error on your web analytics tool, and an 80-90% margin of error for your keyword tracking tools?

The fact of the matter is that these tools are only going to give you a general idea of trends, a vague sense of keyword positioning and a rough sense of how you match up.

I don’t believe you, Jeff

Fair, I wouldn’t either. But let me show you something:

Here’s where Brafton ranks for “Content marketing agency” across various cities in the United States, according to Bright Local’s geographic search tool:

Chicago: 3rd position

San Francisco: 1st position

Austin, Texas: 2nd position

Might not seem like much of a difference, but if I told someone I ranked #1 for this keyword, that wouldn’t be fully accurate, would it? Especially when you are projecting traffic based on positional click-through rate. Look at the likely differences in traffic based on position:

City            Position   CTR    Traffic   Difference
San Francisco   1          30%    150       n/a
Austin          2          15%    75        -75
Chicago         3          9%     45        -105

Big difference.
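
Here’s the arithmetic behind that table, if you want to run your own numbers. The CTRs are the rough positional figures from the table, and the 500 monthly searches is an assumption:

```python
# Rough positional CTRs (from the table above) and an assumed
# monthly search volume of 500 for the keyword.
ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.09}
search_volume = 500

rankings = {"San Francisco": 1, "Austin": 2, "Chicago": 3}

baseline = search_volume * ctr_by_position[1]
for city, position in rankings.items():
    traffic = search_volume * ctr_by_position[position]
    print(f"{city}: position {position} -> ~{traffic:.0f} visits/month "
          f"({traffic - baseline:+.0f} vs. position 1)")
```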

“Okay, so rankings differ by location, and you can specify a location in some tools,” you argue?

Sure, that works, if you want to isolate one city in the world. But what about your entire keyword ownership at a domain level?

I still don’t believe you, Jeff

Stubborn, that’s fine. Let’s take a look at how accurately our various keyword tools report on the total number of keywords our site ranks for in positions 1-50, in the United States:

Seranking.com: 2,373

Moz: 7,900

SpyFu: 8,612

SEMrush: 10,500

Ahrefs: 10,621

Search Console: 19,757

Standard deviation: 5,662 keywords
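
Don’t take my word for the spread; it’s a few lines of Python using the six counts above:

```python
from statistics import mean, stdev

# Keyword counts reported by each tool (listed above).
counts = [2_373, 7_900, 8_612, 10_500, 10_621, 19_757]

print(f"Mean:    {mean(counts):,.0f} keywords")   # ~9,960
print(f"Std dev: {stdev(counts):,.0f} keywords")  # ~5,663, the figure above within rounding
```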

So if you’re reporting one of these tools’ metrics to your boss or a client as though they are an accurate representation of your site’s search presence, you’re in for a rude awakening.

No two tools will ever align, nor will they be accurate.

What you SHOULD be looking at

These SEO tools are best thought of as relative comparison and trend tools. Meaning: How does my site roughly stack up against similar competitors, and which direction is my search presence trending?

1. Search presence relative to competitors

There is hardly a better way to get a quick-and-dirty understanding of your “share of search” versus your competitors.

These tools give you insights into one of three possible outcomes:

      1. We’re getting killed by our competitors.
      2. We’re about even with our competitors.
      3. We’re killing our competitors.

The data from SEMrush tells us that we are getting killed by Content Marketing Institute, we are about even with Newscred, and we are killing Brandpoint and Curata.

Fussing over differences in search presence of 20% in either direction is a waste of time, as proven above.

You’re looking for big differences.

2. Keyword targeting

Looking through the keywords your competitors rank for is a great source of keyword ideas.

First, they will show you keywords your competitors rank for that you should be targeting. These are keyword opportunities.

Second, they will show you keywords that you are both competing for, and who is currently winning.

Example 1

Newscred ranks in position 1 for “Creative marketing agency,” whereas we don’t even land in the top 100 for this phrase!

For shame!

This keyword is central to our offering, and has high commercial intent. Clearly this opportunity was overlooked, and I wouldn’t have even known if it weren’t for this tool.

Example 2

Both Curata and Content Marketing Institute are beating us for “Content curation”, which gets 2,900 searches per month. I need to revisit this page and find out why we aren’t competing. Perhaps we need to re-optimize the piece.

It looks like it’s been steadily losing ground. Perhaps the content is outdated, or my competitors have written more comprehensive content on the topic. Either way, I’m getting beaten and I need to find out why.

3. Trends

You want to get a general idea of whether your domain’s search presence is growing or contracting. This will give you quick signals in a number of areas:

      1. Are your SEO efforts paying off?
      2. Have you been hit by an algorithm update?
      3. Has your growth plateaued?

This is a nice hockey stick-looking graph. Even if the numbers are off 20% either direction, I’m fairly confident we are in a growth phase.


5. Creating straw-man comparisons

Anytime you contrast your SEO work in a positive light against someone else’s work in a negative light, you look like an insecure jerk.

Also, it assumes they have the same objectives as you. Let me show you an example:

This slide is trying to convince you that visitors who view news content are more “engaged” than visitors who don’t. Pretty compelling metric, especially if you are the one creating the news content, right?

Wrong.

This slide is the epitome of irresponsible reporting.

Why it’s wrong to peacock

1. You’re doing a blatant sleight of hand.

Let me show you how the magician pulls off this trick. By creating Advanced Segments in Google Analytics you can segment out specific visits to evaluate their behavior. Here is how I would create a segment to see how visitors who visited a blog page behaved:

Here is a segment of visitors who did not visit a blog page:

And the result, of course, is a favorable metric giving the impression that visitors love the content you created way more than the rest of the content on the site.

Here’s the problem: You created a scenario that you can’t lose.

By using the metric “Page Contains XYZ,” you are including every visit that viewed a particular part of the site at some point. In our instance, 47% of all pageviews are to the blog. We are blog-heavy.

By creating a second segment of visitors who never view a blog page during their visit, you have excluded every visitor who viewed even one blog page, leaving an audience that can only ever generate pageviews on the remaining 53% of the site.

You’re comparing an audience who has the potential to view 100% of pages available with an audience who has the potential to view 53% of all pages available, in our instance.

And the numbers suck…

When you require an audience to have engaged with a portion of the site, its numbers will usually look better than those of the segment that isn’t allowed to engage with that portion, especially if a huge chunk of your audience normally views that part of the site.

I can swap other sections of the site in these reports, and the required pageview audience will usually perform better than the non-required pageview audience. Let’s see what happens when I compare resource page visitors vs. non-resource page visitors:

This will never be an accurate representation of what’s going on. For a real understanding of how people interact with a site, you need to compare like metrics. More on that later.
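
If you doubt that the deck is stacked, here’s a toy simulation. The site is hypothetical and every visitor behaves identically, yet the “viewed a blog page” segment still wins:

```python
import random

random.seed(42)

# Toy site: 47% of pages are blog posts (like ours), and every visitor
# behaves identically, viewing 1-6 random pages. No real difference exists.
PAGES = ["blog"] * 47 + ["other"] * 53

def session():
    return [random.choice(PAGES) for _ in range(random.randint(1, 6))]

sessions = [session() for _ in range(100_000)]

saw_blog = [s for s in sessions if "blog" in s]
no_blog = [s for s in sessions if "blog" not in s]

def pages_per_session(group):
    return sum(len(s) for s in group) / len(group)

print(f"Viewed a blog page:       {pages_per_session(saw_blog):.2f} pages/session")
print(f"Never viewed a blog page: {pages_per_session(no_blog):.2f} pages/session")
# The "blog" segment wins by construction: long sessions are more likely
# to contain at least one blog page, so short sessions pile up in the
# "never viewed" segment. Same visitors, rigged comparison.
```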

2. You’re creating “sides”.

Whether you are comparing content you contribute to a site, or the organic traffic you create versus the PPC traffic provided by a vendor, you’re pitting your work versus their work.

Doing this serves one purpose: self-preservation.

The problem with “self-preservation reporting” is that it puts your interests ahead of the success of the website. It also prevents you from reporting true insights and from pivoting away from strategies that are no longer performing (more on this later).

3. You’re comparing apples and oranges

Sites are made up of different types of pages: blog articles, product pages, resource pages, the homepage, etc. Each of these pillars of the site serves a unique purpose and, as such, should have different expectations.

Like pillars of the site, traffic sources also serve unique purposes …

      • PPC: Generally you want to drive commercial-intent traffic (people who want to buy things). You will want to see conversions from this audience.
      • Social: Fairly top-of-funnel, and you shouldn’t expect conversions.
      • Referral: Likely a combination of top, mid, and bottom-of-funnel traffic.

What you SHOULD be doing

1. Evaluate bulk sections of a site by landing pages

The phrase “landing page” is a bit confusing because we use it interchangeably for different purposes. Let me clear this up:

Editorial landing pages: Any page you create that isn’t a blog article (generally).

Web analytics landing pages: Literally the first page a person sees when they land on your site, and their behavior afterwards.

A landing page report is one of the most insightful reports you can look at, because it does two important things:

      1. It clearly highlights underperformers when similar pages are contrasted.
      2. It shows how visitors interact with your site after arriving via a particular page, which gives you insights into a visitor’s first impression.

Take a look at this landing page report of our product landing pages.

Let’s zoom in on this a little bit; I see something funny…

All of these pages serve the same purpose: to generate conversions. Period.

Because we are comparing similar pages with identical goals, we can safely make assumptions about pages that are performing and underperforming.

The /seo page: Generated 9 conversions over the time period.

The /content-marketing-services page: Generated 0 conversions over the time period.

We can draw the conclusion that our Content Marketing Services page is underperforming. For what reason, I couldn’t say. But I now know that I need to investigate it.
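
Here’s a rough sketch of how I’d flag that automatically across a set of similar pages. The session counts (and the third page) are made up for illustration; the test asks how likely a zero-conversion result would be if the page converted at the group’s pooled rate:

```python
# Hypothetical session counts for similar product pages (the third
# page is made up for illustration); conversions as reported above.
pages = {
    "/seo": (1_200, 9),
    "/content-marketing-services": (1_100, 0),
    "/ppc": (900, 7),
}

# Pooled conversion rate across all similar pages.
total_sessions = sum(s for s, _ in pages.values())
total_conversions = sum(c for _, c in pages.values())
pooled_rate = total_conversions / total_sessions

for url, (sessions, conversions) in pages.items():
    expected = sessions * pooled_rate
    # Chance of zero conversions if the page really converted
    # at the pooled rate (binomial with k = 0).
    p_zero = (1 - pooled_rate) ** sessions
    flag = "  <-- investigate" if conversions == 0 and p_zero < 0.05 else ""
    print(f"{url}: {conversions} conversions "
          f"(expected ~{expected:.1f}, P(zero)={p_zero:.1%}){flag}")
```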

2. Treat a site as an ecosystem rather than competing parts

I’m not going to belabor this, but it’s worth saying: a site doesn’t work unless all of its component parts, and the contributors who represent those parts, work together.

A siloed website is a failed website.

TL;DR

      1. Know what a commercially relevant conversion looks like. If you’re reporting anything you can’t tie back to revenue, you’re reporting on an indicator, not a goal.
      2. Organic traffic is not the Holy Grail of digital marketing.
      3. Poor time period comparisons are lies.
      4. SEO tools are not accurate, and shouldn’t be used to report on precise results.
      5. Self-preservation is not a reporting style; it’s detrimental to the site.

Check in soon for part 2!