Showing posts with label analytics. Show all posts

Plan Do Check Act, SMART Goals and A/B Split Testing

Split testing is a valuable part of strategic keyword management. For a start, it allows you to throw away those keywords that aren't bringing in the results -- but at the same time, it gives you an opportunity to improve on so many areas of your retail marketing effort that learning it is a transferable skill in itself.

Of course, this article looks at split testing from the point of view of keyword management, but it can be, and has been, applied to all kinds of result-dependent activity.

By result-dependent we mean an activity that has a clearly defined and measurable goal; one which produces results that further our guiding strategy.

In strategic management terms, these are results that keep us in line with our mission and contribute to a sustainable competitive advantage.

At the core is something that I may well have alluded to before: PDCA.

Plan Do Check Act


The PDCA cycle (also known as Deming's Cycle) is something that I first came across in the context of process improvement. For a great overview, try the US Navy's Handbook for Basic Process Improvement, which is part of their Strategic Planning collection of documents.

(It's pretty lengthy, so don't get sidetracked, but there are plenty of useful flowcharts in there that can be easily applied to outsourced projects -- e.g. manual keyword expansion from a root keyword phrase, checking vital keyword effectiveness metrics against baseline measurements, and so on.)

The point of PDCA is that you first make a plan. Then, you carry out a process that has been created to achieve the goal set out in the plan.

Next comes a period of checking -- were the expected results achieved? -- and adjusting the process (the 'Do' part) so that there is a higher chance of achieving the goals set out in the Plan.

In a nutshell, that's the Deming cycle, and it can be applied to a lot of different activities. Such as: A/B split testing of keyword-rich headlines that form part of an advert campaign (on AdWords, for example).

However, before we look at A/B split testing (the process, or 'Do' part), we need to make sure that we understand what it is supposed to achieve. For that, I like to use SMART goals.

SMART Goals


There's a great discussion of the various meanings behind the SMART moniker on the Project Smart web site. While you're free to choose a set that makes sense to you, here's what I recommend:

  • S : specific (i.e. not at all vague)
  • M : measurable (i.e. numerically)
  • A : achievable (i.e. not unrealistic)
  • R : results-oriented (i.e. has to take you to where you want to go)
  • T : time-bound (i.e. set a time and/or cost limit)

In this case, your goal might be to increase click-through rates by 1% within 6 months by varying keyword placement in an advert heading.

The goal is set, but what is the process that will deliver the desired result?

A/B Split Testing


The driving principle behind A/B split testing is that you have two test cases, and you want to see which one performs better. Once you know, you devise another case and pitch it against the current best performer.

It sounds simple... and it is.

Within our PDCA process, it's the 'Do' part:

  • Plan : "increase click-through rates by 1% within 6 months by varying keyword placement in an advert heading";
  • Do : execute split testing for up to 6 months (or until CTRs increase by 1%);
  • Check : did we achieve the goal?
  • Act : if not, how can we improve or change the process?
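
The cycle above is easy to sketch as a loop. This is a minimal illustration only -- the goal figures and the `run_split_test` callback are made up, not taken from any real campaign:

```python
# PDCA sketch: 'Plan' is the goal, 'Do' runs the split test each month,
# 'Check' compares against the goal, 'Act' is the fallback revision step.
GOAL_CTR_LIFT = 1.0       # percentage points (the Plan)
DEADLINE_MONTHS = 6

def pdca(run_split_test, baseline_ctr):
    for month in range(1, DEADLINE_MONTHS + 1):          # Do
        current_ctr = run_split_test(month)
        if current_ctr - baseline_ctr >= GOAL_CTR_LIFT:  # Check
            return f"goal met in month {month}"
    return "goal missed: revise the process"             # Act

# Illustrative run: CTR improves by 0.3 points a month from a 2.0% baseline.
print(pdca(lambda m: 2.0 + 0.3 * m, baseline_ctr=2.0))
```
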

The point is that Google AdWords will present you with hundreds of keyword phrases, along with data to help you pick the best one. But it's the best one from Google's point of view or, at a push, the best one that's identified using anonymised data.

It might not be the best one for you.

By split testing across candidate keywords, you can easily figure out which gives you the best return on your investment. Here, we've only put the goal in terms of CTR, but it could equally have been oriented around the eventual action of the visitor.
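Deciding whether one variant really performs better -- rather than just getting lucky -- comes down to comparing two click/impression proportions. Here's a minimal sketch using a two-proportion z-test; the function name, threshold and figures are my own, not part of any tool discussed here:

```python
import math

def ab_winner(clicks_a, imps_a, clicks_b, imps_b, z_crit=1.96):
    """Compare two variants' CTRs with a two-proportion z-test.

    Returns 'A', 'B', or None when the difference is not significant
    at roughly the 95% confidence level.
    """
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled proportion under the null hypothesis (no real difference).
    p = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p * (1 - p) * (1 / imps_a + 1 / imps_b))
    if se == 0:
        return None
    z = (p_a - p_b) / se
    if abs(z) < z_crit:
        return None  # no clear winner yet: keep testing
    return "A" if z > 0 else "B"

# Example: variant B's headline pulls a clearly higher CTR.
print(ab_winner(clicks_a=40, imps_a=10000, clicks_b=90, imps_b=10000))
```

The `None` result is just as useful as a winner: it tells you the test needs more impressions before you retire either variant.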

I've also talked about varying keyword phrases, but what about varying the words around those phrases? A great source of variations can be found in a free eBook "87 Marketing Secrets of the Written Word" by the legendary Ted Nicholas (link goes to his main site, and isn't an affiliate link.)

By picking carefully and pitching different combinations of words against each other, you will quickly build up a reliable list of variations for your chosen niche, as long as you follow PDCA, SMART and A/B Split Testing principles.

Tuesday, 23 August 2016

On Tools & Conversion Rates: Notes from the Trenches of Strategic Keyword Management

Today, you get a twofer: as in, two for one.

Firstly, an update on this post about tools being put beyond use. Now, I'm not going to name names, but two of the tools I pay for and use regularly have been put beyond use (albeit temporarily) thanks to an AdWords change in the Keyword Planner.

Remember that I postulated that this, at least, was one tool that wouldn't be changed in a hurry?

It turns out I was partially wrong. In their recent article "Google says bots are the main target of Keyword Planner changes", Search Engine Land's Ginny Marvin says the following:

"The reality is that Google’s keyword research tools were designed to help advertisers develop their search campaigns. The availability of the External Keyword Tool for years, however, set expectations that it should be open and available to all."

For those who are unaware of the pending changes -- I say pending, because not everyone will be affected straight away, and different accounts are likely to be affected in different ways -- they centre around restricting data to accounts that have been designated as research-only.

In other words, if you don't spend with AdWords, then you don't get the data.

Economically, with my academic hat on, it makes sense. Strategically, though, it's a bit at odds with Google's Don't Be Evil mantra, and the SEO / KWR part of me can't help but think that limiting data that helps content creators research what people are looking for -- and are willing to pay for -- will only result in content blanket bombing and post-publication keyword research.

In other words, we're going to go back to speculative content spinning around known search terms, just to see where the content gets placed in Google's index (via the Search Console and Analytics reports) rather than pro-active keyword research.

While reactive keyword research -- as I call it -- certainly has its place in your keyword research toolbox, what we don't want is a sudden glut of spun content designed to test the viability of keywords, and provide no real value.

Meanwhile, the tools I mentioned? They rely on API access, and seem to be being throttled -- either by the tools' creators, or by the API limits -- such that they have ceased working. I'm sure a fix is in the works, and I have standby techniques that, while a bit more long winded, get me to the same place.

A case of do as I say, and as I do, for once!

Meanwhile on a more positive note...

There's a great graph in Jason Tabeling's Search Engine Watch article "Online-to-offline search value is exploding with 'near me' searches" that illustrates something that I've been pushing out to retail clients for a while now.

In a recent post "How Keyword Research can help High Street Retailers", I pointed out that many people are now comparing retailers on the high street whilst actually being on the high street with mobile devices in hand.

The anecdotal evidence is borne out by Tabeling's research, and it turns out that "near me" searches are pretty likely to be conducted on a mobile device, which is interesting.

More interesting from a retailer's point of view is his assertion that advertisers "are paying 30% more for 'near me' searches compared to searches with other terms, even though our data showed literally ZERO conversions for 'near me' terms."

Readers of my "Cheat Sheet" eBook will know that part of the strategic management of keywords pivots around PPC campaigns, and looking for the money in the market (think Search Volume x Cost Per Click).
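That "money in the market" calculation is simple enough to sketch in a few lines; the keyword phrases, volumes and CPCs below are invented for illustration:

```python
# Hypothetical keyword data: (phrase, monthly search volume, suggested CPC in £)
keywords = [
    ("garden furniture", 12000, 0.85),
    ("rattan sofa set near me", 900, 1.40),
    ("outdoor table", 6500, 0.60),
]

# 'Money in the market' proxy: Search Volume x Cost Per Click.
by_value = sorted(keywords, key=lambda k: k[1] * k[2], reverse=True)
for phrase, vol, cpc in by_value:
    print(f"{phrase}: £{vol * cpc:,.2f}")
```
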

Tabeling's research shows that 'near me' searches attract a very high CPC, have solid volume, but don't get the clicks. In other words, advertisers are spending for the sole purpose of exposure -- probably with their physical address -- near the top of the SERPs, or in AdSense side-bars.

He suggests that it's time for all advertisers -- and I would say, clicks'n'bricks retailers especially -- to review how they track conversion rates for so-called online-to-offline traffic, as in this new digital advertising reality, conversion "might not be from the traditional online search, but from a find a store visit or a click-to-call action."

Measuring the ROI might not be the simplest part of strategic keyword management, but as in other aspects of retail, it is vital to know how much you are getting back for the spend that is being forced upon you.

Friday, 19 August 2016

What if All Your Keyword Research Tools Disappeared Overnight?

How To Avoid This Often Terminal SEO Mistake

Recently I had a client whose favourite -- indeed only -- keyword research tool had been put beyond use.

Behind the initial panic was the growing realisation that she didn't actually know how the tool did what it did. It had been selecting excellent keywords, and had served her well in the past, but she had no idea how it worked under the hood.

To be clear on this point: “put beyond use” can mean any one of the following:

  • A temporary glitch;
  • The provider went out of business;
  • The fees went up beyond your capacity to pay them;
  • The source of the data that the provider was using was put beyond use.

The first is easy: it’s a waiting game. Just make sure that you are always a few days (or weeks) ahead of the data point. In other words, don’t leave it until the last minute to have a basket of keywords to work with: use seasonal and outlier analysis to keep ahead of the game!

The rest of the items in the above list are catastrophic if you have no idea how to replicate the service manually.

So, here’s the first lesson: don’t use metrics provided by third parties that you can’t replicate manually. Any services you use should merely be time-saving conveniences; if they have special sauce, one day you’ll be left with only a dry bowl of pasta.

You’ll still eat, but it won’t be pleasant...

Make Sure You Have An SEO Plan B


Even if the absolute worst case scenario is manual calculation using the most basic tools, it’s better than nothing. With the right preparation, it can be outsourced, releasing you to perform business functions that add value to your proposition.

For every SEO action -- from keyword research to social posting and link building -- make sure you have an alternative to the time-saving tools you use on a daily basis. There are some that you can’t do without (Google Search, AdWords, and the actual social networks themselves, for example) but make sure that these are:

  • In the minority;
  • Unlikely to be put beyond use.

In the keyword research field, we often deploy third party tools that gather up data and then present it in a way that lets us draw conclusions. For example, the AdWords Keyword Planner and SEMRush analysis tools pull together a number of data points -- some proprietary, some not -- to save time.

Decision tools such as Market Samurai, and automation tools like KeywordTool.io and Power Suggest Pro*, also fall into the category of must-haves for people embarking on serious strategic keyword management projects.

As long as these are used as labour saving devices, and not crutches, that's okay. In fact, although I have used -- and paid for -- some of the above, I never use them for anything except baseline research: just the numbers.

For decision-making, I prefer to rely on my own metrics, which sometimes include charting the data and just using my eyes. However, I do have three so-called ‘first cut’ metrics that I’ve rolled together to help make sure I weed out the non-starters.

These are automated, but could be done manually, if my toys get taken away from me!

Create Your Own Keyword Research (and SEO) Metrics


Probably the most important metrics are those I call ‘first cut’. The idea is to cut away a first set of keywords that aren’t likely to be useful. The metric that you use tends to be one of the following:

  • Pure volume - search volume vs. results returned;
  • Market value - amount of free advertising capital in the niche;
  • PPC Campaign - reducing the cost of a PPC campaign for maximum exposure.

Obviously, to calculate these, you will need tools. For example, my own go-to metric for a pure volume project is to emphasise the addressable market over the sheer number of searches: sometimes referred to as the KEI (Keyword Effectiveness Index).

To calculate the KEI, you will need at least two numbers:

  • Search Volume
  • Number of Results

The second of these is easy enough to scrape from a Google Search (either manually, or with a simple script) but the Search Volume is only really available via the AdWords Keyword Planner.
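With those two numbers in hand, one common formulation of the KEI squares the search volume and divides by the number of results, so that demand is weighted more heavily than supply. A minimal sketch, with invented figures:

```python
def kei(search_volume, num_results):
    """KEI, one common formulation: demand (search volume) squared
    over supply (number of competing results)."""
    if num_results == 0:
        return float("inf")  # no competition at all
    return (search_volume ** 2) / num_results

# Invented figures: a niche phrase vs. a head term.
print(kei(1600, 250_000))        # niche: modest volume, modest competition
print(kei(40_500, 900_000_000))  # head term: huge volume, huge competition
```

Note how the niche phrase scores higher despite its far smaller volume -- exactly the "addressable market over sheer searches" emphasis described above.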

The same is true for the market value metric, where the anticipated cost per click is used to test a keyword's available advertising spend against its search performance, with outliers picked out for inclusion.

Again, Keyword Planner is the only real source; luckily it is unlikely to be put beyond use for the foreseeable future, as it supports most of the Google keyword economy.

That’s good, because the last first cut metric -- used specifically to remove expensive keywords -- is almost the opposite of the market value metric, and likewise relies on data from Keyword Planner.

Tools can help apply all of these metrics: for example, Market Samurai can be set up to filter for commerciality, or cheap keywords, and remove everything else. However, if you don’t know how to do the work manually, and those tools are taken away, you’ll be unable to replicate the results and you’ll have to start from scratch.

Learning how to calculate first cut metrics isn’t a good thing to have to do in a hurry. The best advice is to start now, and roll your own set of metrics: not only will you have a safety net, but you'll also learn a lot in the process. I guarantee it!

-----------------------------------------------------------------------------
* Disclosure: I participate in the Power Suggest Pro Affiliate Scheme, and the article link is an affiliate link. If you prefer not to click it, use www.powersuggestpro.com instead.

Tuesday, 9 August 2016

Zero Traffic Keyword Research Technique To Maximise Results

Recently, I had a client who was panicking over a new blog. For affiliates -- who make their money selling other people’s products, using their own content to create the opportunity to make a sale and build a list -- a lack of traffic is somewhat disturbing.

However, just because a page has no hits, it doesn’t mean that it hasn’t been seen.

In fact, it’s likely to have been seen by the one entity that could make a difference: the Googlebot. Not only that, but if it has been indexed, the message is probably getting in front of an audience -- even a limited one -- something which can come as a surprise for those new to keyword research and search engine optimisation.

Search Console: Your Personal SEO Keyword Search Tool


To help website owners, Google used to have a tool known as GWT (Google Webmaster Tools), which linked Analytics to Search. Since GWT was a bit of a misnomer, Google renamed it Search Console.

It does what it says on the tin: it is a website owner's view on search, from Google’s point of view.

Take a look at the following screenshot:



At first sight, it’s a bit worrying. There are clearly no page views, which means that there will be a CTR (click through ratio) of 0%. Usually, SC is used to track the effectiveness of keywords in terms of that ratio. Without it, clients start to panic.

However, the Search Console also reveals some other interesting statistics which are a veritable goldmine for someone who helps webmasters boost their traffic on the back of strategic keyword management.

Let’s add a few more bits of information to the table:



Now, we can see that Google has not only indexed the pages on the site, but also ranked them, and displayed the listings to potential customers. Okay, so they are well down in the rankings (5.5 - 42.0 isn’t a great result, by any stretch), but at least the following is being seen in the results:

  • The title;
  • The meta data;
  • The summary content.

What’s more, SC is also telling us for which specific keywords the Googlebot has indexed the content. Now, the question is -- are these the keywords that represent the content accurately?

If you conduct outlier analysis on a scatter graph of CTR against SERP position (or rank), you’ll see that, in a crowded list of keywords, there are some keywords in the upper right quadrant -- where the CTR is high even though the results are well down in the SERPs -- that just pull in clicks:



The analysis of this behaviour is that there are some parts of any market that will not only trawl through ten pages of results, but also still click on them if they meet their expectations.

This is the so-called “long tail traffic”, and understanding that where today you get impressions, tomorrow you could be getting clicks can be something of a revelation.
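Picking those upper-right-quadrant outliers out of a Search Console export is straightforward once you've settled on thresholds. The rows and cut-off values below are illustrative only:

```python
# Hypothetical Search Console rows: (query, CTR %, average position).
rows = [
    ("keyword research basics",         0.5,  8.2),
    ("free keyword tool comparison",    6.1, 34.0),
    ("seo tips",                        0.2,  5.5),
    ("long tail keyword fishing guide", 4.8, 41.0),
]

# Upper-right quadrant: unusually high CTR despite a poor (high-numbered)
# average position -- the long tail phrases worth a closer look.
CTR_CUTOFF, POSITION_CUTOFF = 3.0, 20.0
outliers = [q for q, ctr, pos in rows
            if ctr >= CTR_CUTOFF and pos >= POSITION_CUTOFF]
print(outliers)
```
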

The First Keyword Research Strategy You'll Need!


In fact, a technique central to one of the traffic improvement modules used to boost results is known as Zero Traffic Keyword Fishing.

The core idea is that you can create content that has no other purpose than to get Google, via the Search Console, to suggest keywords that have audience outlier properties. To do this, all you have to do is continue to create content, and let Google index it.

Oh, and stop being obsessed with traffic...

Where the traffic improvement comes in is in leveraging something that Ted Nicholas calls “Magic Words” to improve the listing, and then checking to see if the CTR goes up. The CTR is the ratio of clicks through to your web site to impressions (how many times the listing is displayed).

Remember that we noted that an indexed page has at least the Title and a Summary (from the Metadata) displayed in the SERPs. You can get a view of that content through SC, using the Search Appearance from the main menu.

Specifically, under HTML Improvements, Search Console can point to exactly where modifications should be made to improve performance. After all, Google wants to get your valuable information into the hands of people who will benefit from it, as much as you do!

By making sure that you get magic words* into the text, you will have a material effect on the CTR. Beyond that, if you then tweak the page so that it is more relevant, and deploy social media sharing and backlink strategies, you will improve your SERP positions, get more impressions and, eventually, more clicks.

Just remember to play the long game: content only remains relevant for as long as it delivers value to the target audience.

-------------------------------------------------------------------------------------

*Ted Nicholas has lists of words that improve headline performance in his books, and it would be unprofessional of me to list them all here; however, three obvious ones are: "free", "how to" and, of course, "you".

Saturday, 30 July 2016

Stop Swimming With The Sharks and Bask In The Blue Ocean of Keyword Research

In strategic management, one of the concepts we come across quite regularly is the difference between the Red and Blue Oceans. The Red Ocean is the place that we envisage most competition to take place, where conditions are tough:

  • the marketplace, and rules of the game are well known;
  • there is high competition for demand;
  • the trade-off is low margins;
  • differentiation is through price. 

(source: Blue Ocean Strategy web site)

The Blue Ocean, on the other hand, is the polar opposite of all of the above - as the authors of the Blue Ocean Strategy book (W. Chan Kim & Renee Mauborgne) put it: the aim is to "Create uncontested market space" in which to do business.

Most people in keyword research and SEO are swimming with the sharks in the Red Ocean.

It stands to reason: they're using the same techniques, the same data, and leveraging the same tools. There's nothing intrinsically wrong with any of that, except that two people in the same niche, doing pretty much the same analysis, are naturally going to put themselves in direct competition.

So, the first message is an easy one: develop your own unique way of deriving value from the data.

This is the point of an area of study known as Data Science. I recently read an excellent free eBook on the topic by Jerry Overton (a professional Data Scientist) -- Going Pro in Data Science -- in which the author explains that the key to success in data science is picking signals out of the noise.

Looking for patterns, in other words.

Traditional SEO practitioners spend a lot of time telling us we need to look for patterns in order to chase the market. It makes sense on the face of it -- go to where the market is, and satisfy its needs -- but there's a slight problem.

Online marketing is getting so saturated that the low hanging fruit in your niche may well have already been picked. Or worse still, it's become a red ocean of marketers all competing for the same space.

Retailers suffer a similar consequence; they have to compete with each other, and online retailers. Even those in retail service -- restaurants, beauty salons and so forth -- will find that they need to adopt their own data science strategies to pick their market's minds, and develop strategies to exploit their competitive advantage.

In fact, strategic management also tells us that there are resources that contribute to our competitive advantage that must be valuable, rare, difficult to imitate and organised in such a way as to make exploitation easy (VRIO analysis).

Keyword research falls into such a category, but only if you develop your own data science to reveal the answers that you are looking for. After all, there's nothing rare or difficult to imitate about simply looking for the keyword phrase with the highest CTR, or the most search volume, or even the least competition for space in the search engines' indexes.

Everyone is looking for those keywords, so even if the space is uncontested now, it's unlikely to remain that way for long!

Luckily, most modern tools allow you to download your research data, so that different data points can be combined in a spreadsheet, and algorithms developed that let you organise, sort, and prioritise keywords according to your needs.

For example, you might download data from Google's Search Console (such as CTR and SERP data per query), and merge that in a spreadsheet with search volume data from the Keyword Planner, to enable you to perform analysis to reveal likely candidates for a new campaign centred around a keyword with certain distinct properties.
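A minimal sketch of that merge, using plain Python dictionaries in place of a spreadsheet (the queries and figures are invented):

```python
# Hypothetical exports: Search Console stats keyed by query, and
# Keyword Planner monthly volumes for (some of) the same queries.
search_console = {
    "rattan sofa set":  {"ctr": 4.2, "position": 31.0},
    "garden furniture": {"ctr": 0.7, "position": 6.1},
    "patio heater":     {"ctr": 1.9, "position": 12.4},
}
keyword_planner = {"rattan sofa set": 880, "garden furniture": 12100}

# Merge on the query, keeping only rows with data from both sources.
merged = []
for query, stats in search_console.items():
    volume = keyword_planner.get(query)
    if volume is None:
        continue  # no Planner figure for this query
    merged.append({"query": query, **stats, "volume": volume})

for row in merged:
    print(row["query"], row["ctr"], row["volume"])
```
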

Consider the following chart, for example.

It shows something I call the 'outlier technique', where we look not for a pattern, but for a signal within the noise of an existing pattern. In this case, the pattern is represented by a cluster of keyword phrases that fall into a clearly defined area, and a selection of keyword phrases that don't.

These outliers show phrases for which something out of the ordinary is happening.

The graph maps the SERP position for each phrase against the CTR. The phrases in the bottom left, then, are those where competition is pretty stiff. This data is taken from the Search Console account for a client blog, and one of the first messages is that there are many instances where, despite appearing high in the SERPs, the CTR is low -- not many people are choosing to click on the links, despite the entry appearing (on average) within the first three pages.

There are many reasons for this, but in some cases it's that the page ranks for a keyword which is not represented by the content (heading & summary) well enough to convince the searcher that a visit is worth their while.

At the other end of the spectrum, top-right of the graph, we have a phrase which, although it appears way down the SERPs, has a very high CTR.

Either the content is very good, or the market is very hungry. Both of these possibilities are worth exploring further: the next graph is a natural extension, and combines Keyword Planner data with the Search Console data, for the same set of keywords.

With this iteration, we have more signals. This time, we have captured the available traffic, a simple multiplication of the volume by our CTR. It's a simple metric, but already we see that there are three phrases that show important characteristics that make them worthy of engagement for future products, publications or traffic attracting content.
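The available traffic metric itself is a one-line calculation -- volume multiplied by the CTR expressed as a fraction (figures invented):

```python
def available_traffic(volume, ctr_percent):
    """Available traffic: monthly search volume x CTR (CTR given in %)."""
    return volume * (ctr_percent / 100.0)

# e.g. 880 monthly searches at a 4.2% click-through rate:
print(available_traffic(880, 4.2))
```
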

So, create your own metrics, and chase your own set of outliers, and you will be well on your way to making keyword research part of your own competitive advantage.