Here’s How to Supercharge Your Competitive Research Using a URL Profiler and Fusion Tables

Posted by Craig_Bradshaw

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

[Estimated read time: 19 minutes]

As digital marketers, the amount of data that we have to collect, process, and analyze is overwhelming. This is never more true than when we’re looking into what competitors are doing from a link building perspective.

Thankfully, there are a few things we can do to make this job a little bit easier. In this post, I want to share with you the processes I use to supercharge my analysis of competitor backlinks. You’ll learn:

  • How to use URL Profiler for bulk data collection
  • How to use fusion table network graphs to create powerful data visualizations
  • How to build an SEO profile of the competition using URL Profiler and fusion tables

Use URL Profiler for bulk data collection

Working agency-side, one of the first things I do for every new client is build a profile of their main competitors, including those who have a shared trading profile, as well as those in their top target categories.

The reason we do this is that it provides a top-level overview of the industry and how competitive it actually is. This allows us to pick our battles and prioritize the strategies that will help move the right needles. Most importantly, it’s a scalable, repeatable process for building links.

This isn’t just useful for agencies. If you work in-house, you more than likely want to watch your competitors like a hawk in order to see what they’re doing over the course of months and years.

In order to do this, you’re inevitably going to need to pull together a lot of data, drawing on a range of different tools and data points.

As it turns out, this sort of activity is where URL Profiler becomes very handy.

For those of you who are unfamiliar with URL Profiler, it’s a bulk data tool that allows you to collect link and domain data from thousands of URLs all at once. As you can probably imagine, this makes it an extremely powerful tool for link prospecting and research.

URL Profiler is a brilliant tool built for SEOs, by SEOs. Since every SEO I know seems to love working with Excel, the output you get from URL Profiler is, inevitably, most handy in spreadsheet format.

Once you have all this amazing bulk data, you still need to be able to interpret it and drive actionable insights for yourself and your clients.

To paraphrase the great philosopher Ben Parker: with great data power comes great tedium. I’ll be the first to admit that data can be extremely boring at times. Don’t get me wrong: I love a good spreadsheet as much as I love good coffee (more on that later); but wherever possible, I’d much rather just have something give me the actionable insights I need.

This is where the power of data visualization comes into play.

Use fusion tables for powerful data visualization

Have you ever manually analyzed one million articles to see what impact content format and length have on shares and links? Have you ever manually checked the backlink profile of a domain that has over half a million links? Have you ever manually investigated the breakdown of clicks and impressions your site gets across devices? Didn’t think so.

Thanks to Buzzsumo & Moz, Majestic, Ahrefs, and the Google Search Console, we don’t have to; we just use the information they give us to drive our strategy and decision-making.

The reason these tools are so popular is they allow you to input your data and discern actionable insights. Unfortunately, as already mentioned, we can’t easily get any actionable insights from URL Profiler. This is where fusion tables become invaluable.

If you aren’t already familiar with fusion tables, then the time has come for you to get acquainted with them.

Back in 2012, Google rolled out an “experimental” version of their fusion tables web application. They did this to help you get more from your data and tell the story of what’s going on in your niche with less effort. It’s best to think of fusion tables as Google’s answer to big data.

There are plenty of examples of how people are using fusion tables to tell their stories with data. However, for the purpose of brevity, I only want to focus on one incredibly awesome feature of fusion tables — the network graph.


If fusion tables are Google’s answer to big data, then the network graph feature is definitely Google’s answer to Cerebro from X-Men.

I won’t go into too many details about what network graphs are (you can read more about them here), as I would much rather talk about their practical applications for competitive analysis.

Note: There is a fascinating post on The Moz Blog by Kelsey Libert about effective influencer marketing that uses network graphs to illustrate relationships. You should definitely check that post out.

I’d been using URL Profiler and fusion tables in isolation from each other for quite a while — and they each worked very well — before I figured out how to combine their strengths. The result is a process that combines the pure data collection power of URL Profiler with the actionable insights that network graphs provide.

I’ve outlined my process below. Hopefully, it will allow you to do something similar yourself.

Build a competitive SEO profile with URL Profiler and fusion tables

To make this process easier to follow, we’ll pretend we’re entering the caffeinated, yet delicious space of online coffee subscriptions. (I’ve chosen to use this particular niche in our example for no reason other than the fact that I love coffee.) Let’s call our hypothetical online coffee subscription company “Grindhaus.”

Step 1: Assess your competition

We’ll start by looking at the single keyword “buy coffee online.” A Google search (UK) gives us the top 10 results that we’ll need to crack if we want to see any kind of organic progress.

Step 2: Gather your data

However, we’ve already said that we want to scale up our analysis, and we want to see a large cross-section of the key competitors in our industry. Thankfully, there’s another free tool that comes in handy for this. The folks over at URL Profiler offer a number of free tools for Internet marketers, one of which is called the SERP Scraper. No prizes for guessing what it does: add in all the main categories and keywords you want to target and hit scrape.


As you can see from the image above, you can do this for a specific keyword or set of keywords. You can also select which country-specific results you want to pull, as well as the total number of results you want for each query.

It should only take a minute or so to get the results of the scrape back in a spreadsheet.

In theory, these are the competitors we’ll need to benchmark against in order for Grindhaus to see any sort of organic progress.

From here, we’ll need to gather the backlink profiles for the companies listed in the spreadsheet one at a time. I prefer to use Majestic, but you can use any backlink crawling tool you like. You’ll also need to do the same for your own domain, which will make it easier to see the domains you already have links from when it’s time to perform your analysis.

After this is done, you will have a file for your own domain, as well as a file for each one of the competitors you want to investigate. I recommend investigating a minimum of five competitors in order to build a data set large enough to draw useful insights from.

Next, what we need to do is clean up the data so that we have all the competitor link data in one big CSV file. I organize my data using a simple two-column format, as follows:

  • The first column contains the competitor being linked to. I’ve given this column the imaginative heading “Competitor.”
  • The second column contains the domains that are linking to your competitors. I’ve labeled this column “URL” because this is the column header the URL Profiler tool recognizes as the column to pull metrics from.

Once you have done this, you should have one huge list of all the referring domains for your competitors.
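If you’d rather script this cleanup than wrangle it in Excel, a minimal Python sketch might look like the following. It assumes one export CSV per competitor, each file named after its competitor, with a “Referring Domain” column; backlink tools label their columns differently, so adjust the column name to match your export:

```python
import csv
import glob
import os

def combine_backlink_exports(export_dir, out_path, domain_column="Referring Domain"):
    """Merge one-CSV-per-competitor backlink exports (each file named
    after its competitor, e.g. 'pactcoffee.com.csv') into the single
    two-column Competitor/URL file described above."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        # "URL" is the header URL Profiler recognizes when pulling metrics
        writer.writerow(["Competitor", "URL"])
        for path in sorted(glob.glob(os.path.join(export_dir, "*.csv"))):
            competitor = os.path.splitext(os.path.basename(path))[0]
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    writer.writerow([competitor, row[domain_column]])
```

Write the combined file outside the export folder so the script doesn’t try to read its own output.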

This is where the fun begins.

Step 3: Gather even more data

Next, let’s take every domain that links to one, some, or all of your competitors and run them all through URL Profiler in a single batch. Doing this will pull back all the metrics we want to see.

It’s worth noting that you don’t need any additional paid tools or APIs to use URL Profiler, but you will have to set up a couple of API keys. I won’t go into detail here on how to do this, as there are already plenty of resources explaining this readily available, including here and here.

One of the added benefits of doing this through URL Profiler is that you can use its “Import and Merge” feature to append metrics to an existing CSV. Otherwise, you would have to do this by using some real Excel wizardry or by tediously copying and pasting extreme amounts of data to and from your clipboard.
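Under the hood, “Import and Merge” is essentially a keyed join. If you ever need to replicate it yourself, a rough Python equivalent might look like this (the column names are placeholders to match to your own files):

```python
import csv

def merge_metrics(link_file, metrics_file, out_file, key="URL"):
    """Left-join a metrics CSV onto the competitor/URL list,
    matching rows on the shared URL column. Rows with no metrics
    are kept, with the metric columns left blank."""
    with open(metrics_file, newline="") as f:
        reader = csv.DictReader(f)
        metric_fields = [c for c in reader.fieldnames if c != key]
        metrics = {row[key]: row for row in reader}
    with open(link_file, newline="") as f, open(out_file, "w", newline="") as out:
        reader = csv.DictReader(f)
        writer = csv.DictWriter(out, fieldnames=reader.fieldnames + metric_fields)
        writer.writeheader()
        for row in reader:
            extra = metrics.get(row[key], {})
            row.update({c: extra.get(c, "") for c in metric_fields})
            writer.writerow(row)
```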

As I’ve already mentioned, URL Profiler allows me to extract both page-level and domain-level data. However, in this case, the domain metrics are what I’m really interested in, so we’ll only examine these in detail here.

Majestic, Moz, and Ahrefs metrics

Typically, SEOs will pledge allegiance to one of these three big tools of the trade: Majestic, Moz, or Ahrefs. Thankfully, with URL Profiler, you can collect data from any or all of these tools. All you need to do is tick the corresponding boxes in the Domain Level Data selection area.

In most cases, the basic metrics for each of the tools will suffice. However, we also want to be able to assess the relevance of a potential link, so we’ll also need Topical Trust Flow data from Majestic. To turn this on, go to Settings > Link Metrics using the top navigation and tick the “Include Topical Trust Flow metrics” box under the Majestic SEO option.

Doing this will allow us to see the three main topics of the links back to a particular domain. The first topic and its corresponding score will give us the clearest indication of what type of links are pointing back to the domain we’re looking at.

In the case of our Grindhaus example, we’ll most likely be looking for sites that scored highly in the “Recreation/Food” category. The reason we want to do this is because relevance is a key factor in link quality. If we’re selling coffee, then links from health and fitness sites would be useful, relevant, and (more likely to be) natural. Links from engineering sites, on the other hand, would be pretty irrelevant, and would probably look unnatural if assessed by a Google quality rater.
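If you want to apply this relevance check programmatically once the data is exported, a simple sketch might look like the following. The column names mirror how Majestic’s Topical Trust Flow fields typically appear in the export, but treat both the names and the 10-point floor as assumptions to adapt:

```python
def is_relevant(row, target_topics=("Recreation/Food",), min_score=10):
    """Return True if a referring domain's primary Topical Trust Flow
    topic is one of our target categories and its score clears a floor.
    `row` is one CSV row as a dict of column name to string value."""
    topic = row.get("Topical Trust Flow Topic 0", "")
    try:
        score = float(row.get("Topical Trust Flow Value 0") or 0)
    except ValueError:
        return False  # malformed score: treat as not relevant
    return topic in target_topics and score >= min_score
```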

Social data

Although the importance of social signals in SEO is heavily disputed, it’s commonly agreed that social signals can give you a good idea of how popular a site is. Collecting this sort of information will help us to identify sites with a large social presence, which in theory will help to increase the reach of our brand and our content. In contrast, we can also use this information to filter out sites with a lack of social presence, as they’re likely to be of low quality.

Social Shares

Ticking “Social Shares” will bring back social share counts for the site’s homepage. Specifically, it will give you the number of Facebook likes, Facebook shares, Facebook comments, Google plus-ones, LinkedIn shares, and Pinterest pins.

Social Accounts

Selecting “Social Accounts” will return the social profile URLs of any accounts that are linked via the domain. This will return data across the following social networks: Twitter, Google Plus, Facebook, LinkedIn, Pinterest, YouTube, and Instagram.

Traffic

In the same way that sites with strong social signals give us an indication of their relative popularity, the same can also be said for sites that have strong levels of organic traffic. Unfortunately, without having direct access to a domain’s actual traffic figures, the best we can do is use estimated traffic.

This is where the “SEMrush Rank” option comes into play, as this will give us SEMrush’s estimation of organic traffic to any given domain, as well as a number of organic ranking keywords. It also gives us AdWords data, but that isn’t particularly useful for this exercise.

It’s worth mentioning one more time that this is an estimate of organic traffic, not an actual figure. But it can give you a rough sense of relative traffic between the sites included in your research. Rand conducted an empirical study on traffic prediction accuracy back in June — well worth a read, in my opinion.

Indexation

One final thing we may want to check is whether or not a domain is indexed by Google. If a domain isn’t indexed, it’s possible that Google has deindexed it, which would suggest they don’t trust that particular domain. The use of proxies for this feature is recommended, as it queries Google in bulk, and Google is not particularly thrilled when you do this!

After you’ve selected all the metrics you want to collect for your list of URLs, hit “Run Profiler” and go make yourself a coffee while it runs. (I’d personally go with a nice flat white or a cortado.)

For particularly large lists of URLs, it can sometimes take a while, so it’s best to collect the data a day or two before you plan to do the analysis. For the example in this post, it took around three hours to pull back data for over 10,000 URLs, but I could leave it running in the background while working on other things.

Step 4: Clean up your data

One of the downsides of collecting all of this delicious data is that there are invariably going to be columns we won’t need. Therefore, once you have your data, it’s best to clean it up, as there’s a limit on the number of columns you can have in a fusion table.

You’ll only need the combined results tab from your URL Profiler output, so you can delete the other tabs and re-save your file in CSV format.

Step 5: Create your new fusion table

Head on over to Google Drive, and then click New > More > Google Fusion Tables.

If you can’t see the “Google Fusion Tables” option, you’ll have to select the “Connect More Apps” option and install Fusion Tables from there.

From here, it’s pretty straightforward. Simply upload your CSV file and you’ll then be given a preview of what your table will look like.

Click “Next” and all your data should be imported into a new table faster than you can say “caffeine.”

Step 6: Create a network graph

Once you have your massive table of data, you can create your network graph by clicking on the small red “+” sign next to the “Cards” tab at the top of your table. Choose “Add Chart” and you’ll be presented with a range of chart options. The one we’re interested in is the network graph option.

Once you’ve selected this option, you’ll then be asked to configure your network graph. We’re primarily interested in the link between our competition and their referring domains.

However, the relationship only goes in one direction: I, the referring website, give you, the retailer, a link. Hence the direction of the connection. We should therefore tick the “Link is directional” option, as well as “Color by columns” to make it easier to distinguish between competitors and referring domains.

By default, the network graph is weighted by whatever is in the third column — in this case, it’s Majestic CitationFlow, so our blue nodes are sized by how high the CitationFlow is for a referring domain. Almost instantly, you can spot the sites that are the most influential based on how many sites link to them.

This is where the real fun begins.

One interesting thing to do with this visualization that will save you a lot of time is to reduce the number of visible nodes. However, there’s no science to this, so be careful you’re not missing something.

As you increase the number of nodes shown, more and more blue links begin to appear. At around 2,000 nodes, it’ll start to become unresponsive. This is where the filter feature comes in handy, as you can filter out the sites that don’t meet your chosen quality thresholds, such as low Page Authority or a large number of outbound links.
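One way to apply those thresholds before you ever upload the file is to pre-filter the CSV. This is an illustrative sketch only; the “Page Authority” and “External Links” column names and the cutoff values are assumptions to adapt to your own data:

```python
import csv

def filter_rows(in_path, out_path, min_pa=20, max_external_links=5000):
    """Drop referring domains below our quality thresholds before
    uploading to Fusion Tables, so the network graph stays responsive.
    Returns the number of rows kept."""
    kept = 0
    with open(in_path, newline="") as f, open(out_path, "w", newline="") as out:
        reader = csv.DictReader(f)
        writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            try:
                pa = float(row.get("Page Authority") or 0)
                ext = float(row.get("External Links") or 0)
            except ValueError:
                continue  # skip rows with malformed metric values
            if pa >= min_pa and ext <= max_external_links:
                writer.writerow(row)
                kept += 1
    return kept
```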

So what does this tell us — other than there appears to be a relatively level playing field, which means there is a low barrier to entry for Grindhaus?

This visualization gives me a very clear picture of where my competition is getting their links from.

In the example above, I’ve used a filter to only show referring domains that have more than 100,000 social shares. This leaves me with 137 domains that I know have a strong social following that would definitely help me increase the reach of my content.

You can check out the complete fusion table and network graph here.

Step 7: Find your mutant characteristics

Remember how I compared network graphs to Google’s answer to Cerebro from X-Men? Well, this is where I actually explain what I meant.

For those of you that are unfamiliar with the X-Men universe, Cerebro is a device that amplifies the brainwaves of humans. Most notably, it allows telepaths to distinguish between humans and mutants by finding the presence of the X-gene in a mutant’s body.

Using network graphs, we can specify our own X-gene and use it to quickly find high-quality and relevant link opportunities. For example, we could include sites that have a Domain Authority greater than or equal to 50:


For Grindhaus, this filter finds 242 relevant nodes (from a total of 10,740). In theory, these are domains Google would potentially see as being more trustworthy and authoritative. Therefore, they should definitely be considered as potential link-building opportunities.

You should be able to see that there are some false positives in here, including Blogspot, Feedburner, and Google. However, these are outweighed by an abundance of extremely authoritative and relevant domains, including Men’s Health, GQ Magazine, and Vogue.co.uk.

Sites that have “Recreation/Food” as their primary Topical Trust Flow Topic:


This filter finds 361 relevant nodes out of a total of 10,740 nodes, which all have “Recreation/Food” as their primary Topical Trust Flow Topic.

Looking at this example in more detail, we see another cool feature of network graphs: the nodes with the most connections always sit in the center of the graph. This means you can quickly identify the domains that link to more than one of your competitors, as indicated by the multiple yellow lines. This works in a similar way to Majestic’s “Clique Hunter” feature and Moz’s “Link Intersect” tool.

However, you can do this on a much bigger scale, having a wider range of metrics at your fingertips.
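The same intersection idea is easy to reproduce from the raw two-column file itself. This hypothetical helper counts how many competitors each referring domain links to, which is exactly what the central nodes in the graph represent:

```python
import csv
from collections import defaultdict

def link_intersect(csv_path, min_competitors=2):
    """From the Competitor/URL file, return (referring domain, count)
    pairs for domains linking to at least `min_competitors` competitors,
    sorted by count (highest first), then alphabetically."""
    linked = defaultdict(set)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            linked[row["URL"]].add(row["Competitor"])
    return sorted(
        ((domain, len(comps)) for domain, comps in linked.items()
         if len(comps) >= min_competitors),
        key=lambda pair: (-pair[1], pair[0]),
    )
```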


In this case, toomuchcoffee.com, coffeegeek.com, and beanhunter.com would be three domains I would definitely investigate further in order to see how I could get a link from them for my own company.

Sites that are estimated to get over 100,000 organic visits, weighted by social shares:


For our Grindhaus example, this filter finds 174 relevant nodes out of 10,740, which are all estimated to receive more than 100,000 organic visits per month. However, I have also weighted these nodes by “Homepage Total Shares.” This allows me to see the sites that have strong social followings and have also been estimated to receive considerable amounts of organic traffic (i.e., “estimorganic” traffic).

By quickly looking at this network graph, we can immediately see some authoritative news sites such as The Guardian, the BBC, and the Wall Street Journal near the center, as well as quite a few university sites (as denoted by the .ac.uk TLD).

Using this data, I would potentially look into reaching out to relevant editors and journalists to see if they’re planning on covering National Coffee Week and whether they’d be interested in a quote from Grindhaus on, say, coffee consumption trends.

For the university sites, I’d look at reaching out with a discount code to undergraduate students, or perhaps take it a bit more niche by offering samples to coffee societies on campus like this one.

This is barely scratching the surface of what you can do with competitor SEO data in a fusion table. SEOs and link builders will all have their own quality and relevance thresholds, and will also place a particular emphasis on certain variables, such as Domain Authority or total referring domains. This process lets you collect, process, and analyze your data however you see fit, allowing you to quickly find your most relevant sites to target for links.

Step 8: Publish and share your amazing visualization

Now that you have an amazing network graph, you can embed it in a webpage or blog post. You can also send a link by email or IM, which is perfect for sharing with other people in your team, or even for sharing with your clients so you can communicate the story of the work you’re undertaking more easily.

Note: Typically, I recommend repeating this process every three months.

Summary and caveats

Who said that competitive backlink research can’t be fun? Aside from being able to collect huge amounts of data using URL Profiler, with network graphs you can also visualize the connections between your data in a simple, interactive map.

Hopefully, I’ve inspired you to go out and replicate this process for your own company or clients. Nothing would fill me with more joy than hearing tales of how this process has added an extra level of depth and scale to your competitive analysis, as well as given you favorable results.

However, I wouldn’t be worth my salt as a strategist if I didn’t end this post with a few caveats:

Caveat 1: Fusion tables are still classed as “experimental,” so things won’t always run smoothly. The feature could also disappear altogether overnight, although my fingers (and toes) are crossed that it doesn’t.

Caveat 2: Hundreds of factors go into Google’s ranking algorithm, and this type of link analysis alone does not tell the full story. However, links are still seen as an incredibly important signal, which means that this type of analysis can give you a great foundation to build on.

Caveat 3: To shoehorn one last X-Men analogy in… using Cerebro can be extremely dangerous, and telepaths without well-trained, disciplined minds put themselves at great risk when attempting to use it. The same is true for competitive researchers. However, poor-quality link building won’t result in insanity, coma, permanent brain damage, or even death. The side effects are actually much worse!

In this age of penguins and penalties, links are all too often still treated as a commodity. I’m not saying you should go out and try to get every single link your competitors have. My emphasis is on quality over quantity. This is why I like to thoroughly qualify every single site I may want to try and get a link from. The job of doing competitive backlink research using this method is to assess every possible option and filter out the websites you don’t want links from. Everything that’s left is considered a potential target.

I’m genuinely very interested to hear your ideas on how else network graphs could be used in SEO circles. Please share them in the comments below.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


What You Should Know About Accessibility + SEO, Part I: An Intro

Posted by Laura.Lippay

[Estimated read time: 4 minutes]

Do you know anyone who is visually impaired? Maybe they have low vision or color blindness, or are fully blind. Think about how they use the Internet. Close your eyes, or at least squint really hard, and try to find today’s news or interact with your friends on Facebook. It’s a challenge many of us don’t think about every day, but some of what we do in SEO can affect the experience that people with visual impairments have when visiting a page.

Accessibility and the Internet


Visually impaired Internet users are able to navigate and use the web using screen readers like VoiceOver or Jaws. Screen readers, much like search engine crawlers, rely on signals in the code to determine the structure and the context of what they’re crawling. The overlap in what search crawlers look for and interpret versus what screen readers look for and interpret is small, but the idea is the same: Where are the elements of this page and how do I understand them?

The SEO overlap

While it’s important to understand where SEO and accessibility (a11y) overlap in order to optimize correctly for both, it’s also important to note that optimizing for one is not necessarily akin to optimizing for the other. In other words, if you’ve optimized a page for search engines, it doesn’t mean you’ve necessarily made it accessible — and vice versa.

Recently, web accessibility expert Karl Groves wrote a post called The Accessibility & SEO Myth. Mr. Groves knows the world of accessibility inside and out, and knows that optimizing for accessibility (which goes far beyond optimizing for the visually impaired) is very different overall, and much more complex from a strictly technical standpoint, than optimizing for search engines. He’s right: despite the ways SEO and a11y overlap, a11y is a whole different ballgame. But if you understand the overlap, you can successfully optimize for both.

Here are just some examples of where SEO and accessibility can overlap:

  • Video transcription
  • Image captioning
  • Image alt attributes
  • Title tags
  • Header tags (H1, H2, etc)
  • Link anchor text
  • On-site sitemaps, table of contents, and/or breadcrumbs
  • Content ordering
  • Size and color contrast of text
  • Semantic HTML

If you’re developing the page yourself, I would challenge you to learn more about the many things you can do for accessibility beyond where it overlaps with SEO, like getting to know ARIA attributes. Take a look at the W3C Web Content Accessibility Guidelines and you’ll see there are far more complex considerations for accessibility than what we typically consider for technical SEO. If you think technical SEO is fun, just wait until you get a load of this.

Optimizing for accessibility or SEO?

Chances are, if you’re optimizing for accessibility, you’re probably covering your bases for those technical optimizations where accessibility and SEO overlap. BUT, this doesn’t always work the other way around, depending on the SEO tactics you take.

Consider a screen reader reaching an image of a pair of women’s black Chuck Taylor All-Star shoes and reading its alt attribute as “Women’s black Chuck Taylor All-Stars buy Chucks online women’s chuck taylors all-stars for sale.” Annoying, isn’t it? (Thankfully, the real Converse site has a pretty descriptive alt attribute in place.) Or compare these page titles with SEO and accessibility in mind: “Calculate Your Tax Return” versus “Online Tax Calculator | Tax Return Estimator | Tax Refund/Rebate.” Imagine you just encountered this page without being able to see the content. Which one more crisply and clearly describes what you can expect of this page?
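As a purely illustrative exercise (not a real accessibility audit), you could even write a naive smell test for stuffed alt text. The thresholds here are arbitrary assumptions:

```python
import re
from collections import Counter

def looks_stuffed(alt_text, max_repeats=1, max_words=12):
    """Naive smell test: alt text that is very long, or that repeats
    the same non-trivial word, tends to read badly through a screen
    reader. Returns True if the text looks keyword-stuffed."""
    words = re.findall(r"[a-z']+", alt_text.lower())
    if len(words) > max_words:
        return True
    counts = Counter(w for w in words if len(w) > 3)
    return any(n > max_repeats for n in counts.values())
```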

While it’s nice to know that proper technical search engine optimization will affect how someone using a screen reader can contextualize your site, it’s also important to understand (1) that these two optimization industries are, on a bigger level, quite different, and (2) that what you do for SEO where SEO and a11y overlap will affect how some visitors can (or can’t) understand your site.

Marking the 5th Global Accessibility Awareness Day https://t.co/QiJyAM8xGW 05/19? Let us know how Globala11yawarenessdayAtGmailDotCom #gaad
— GblA11yAwarenessDay (@gbla11yday) March 14, 2016

For Global Accessibility Awareness Day on May 19, I’ll be collaborating with some experts in a11y on a post that will go into more details on what aspects of SEO + a11y to be keenly aware of and how to optimize for both. I’ll be sure to find as many examples as I can — if you’ve got any good ones, please feel free to share in the comments (and thanks in advance).

Educational resources & tools

In the meantime, to learn more about accessibility, check out a couple of great resources:

  • W3C Techniques for WCAG 2.0
  • ARIA attributes
  • Web Accessibility Evaluation Tools List
  • Web Accessibility Tools & events from the Paciello Group, a well-known accessibility agency
  • Also check out The Paciello Group’s blog. It’s digestible bits of good info
  • Ted Drake’s “css toys for professional web developers” blog, heavy on accessible dev & accessible infographics!


The Guide to International Website Expansion: Hreflang, ccTLDs, & More!

Posted by katemorris

Growth. Revenue, visits, conversions. We all want to see growth. For many, focusing on a new set of potential customers in another market (international, for instance) is a source of growth. It can sometimes seem like an easy expansion. If your current target market is in the US, UK, or Australia, the other two look promising. Same language, same content — all you need is to set up a site for them and target it at them, right?

International expansion is more complicated than that. The ease of expansion depends highly on your business, your resources, and your customers. How you approach expansion and scale it over time takes consideration and planning. Once you’ve gone down a path of URL structure and a process for marketing and content, it’s difficult to change.

This guide is here to help you go down the international expansion path on the web, focused on ensuring your users see the right content for their query in the search engines. This guide isn’t about recommendations for translation tools or how to target a specific country. It is all about international expansion from a technical standpoint that will grow with your business over time.

At the end is a bonus! A flow chart to help you troubleshoot international listings showing up in the wrong place in the SERPs. Have you ever wondered why your Canadian page showed for a user in the US? This will help you figure that out!

Before we begin: Terminology

ccTLD – A country-specific top-level domain. These are assigned by ICANN and are geo-targeted automatically in Google Search Console.

gTLD – A generic top-level domain. These are not country-specific and, if used for country-specific content, they must be geo-targeted inside of Google Search Console or Bing Webmaster Tools. Examples include .com, .net, and .tv. Examples from Google found here.

Subdomain – A major section of a domain, distinguished by a change to the characters before the root domain. The most-used standard subdomain is www. Many sites start with http://www.domain.com as their main subdomain. Subdomains can be used for many reasons: marketing, region targeting, branded micro sites, and more.

Subfolder – A section of a subdomain/domain. Subfolders are sections marked by a trailing slash. Examples include http://ift.tt/1DZqpSp, or in terms of this guide, http://www.domain.com/en or http://www.domain.ca/fr.

Parameter – A modifier of a URL that either tracks the path of a user to the content or changes the content on the page based on the parameters in the URL. These are often used to indicate the language of a page. An example is http://www.domain.com/?lang=fr, with lang being the parameter.

Country – A recognized country that has a ccTLD by ICANN or an ISO code. Google uses ISO 3166-1 Alpha-2 for hreflang.

Region – Collections of countries that the general public groups together based on geography. Examples include the EU or the Middle East. These are not countries and cannot be geo-targeted at this time.

Hreflang – A tag used by Google to allow website owners to indicate that a specific page has a copy in another language. The tags indicate all other translated versions of that page along with the language. The language tags can have regional dialects to distinguish between language differences like British English and American English. These tags can reside on-page or in XML sitemaps.

Meta language – The language-distinguishing tag used by Bing. This tag merely informs Bing of the language of the current page.

Geo-targeting – Both Bing Webmaster Tools and Google Search Console allow website owners to claim a specific domain, subfolder, or subdomain, and inform the search engine that the content in that domain or section is developed for and targeted at the residents of a specific country.

Translation – Changing content from one language or regional dialect to another language or regional dialect. This should never be done with a machine, but rather always performed by someone fluent in that language or regional dialect.

Understanding country and language targeting

The first step in international expansion planning is to determine your target. There is some misunderstanding between country targeting and language targeting. Most businesses start international expansion wanting to do one of two things:

  1. Target users that speak another language.
    Example – A business in Germany: “We should translate our content to French.”
  2. Target users that live in another part of the world.
    Example – A business in Australia: “We should expand into the UK.”

False associations: Country and language

The first issue people run into is associating a country and a language. Many of the world’s top languages have root countries that share the same name; specifically, France/French, Germany/German, Portugal/Portuguese, Spain/Spanish, China/Chinese, Japan/Japanese, and Russia/Russian. Many of these languages are used in a number of other countries, however. Below is a list of the top languages used by Internet users.

[Image: the top languages used by Internet users]

Please note this is not the list of top languages in the world; that is a vastly different list. This list is based on Internet usage. Some languages are set as the official language of only one country, yet users in other countries browse the Internet with that language as their preferred language. An example might be a Japanese national working in the US while setting up a new office.

Another note: the "main" country chosen above is either the country where the language originated (as with England and English) or the country that shares a name with, or is closest to, the language name. This is how many people associate languages and countries in most instances, but those assumptions are not correct.

Flags and languages

We must disassociate languages and countries. There are too many times when a country flag is used to note a language change on a site. Flags should only be used when the country is being targeted, not the language.

[Image: flags and the languages commonly associated with them]

Web technology and use impacts targeting

The second issue arises in the execution. The business in Germany from the first few examples might hire a translator from France and translate their content to French. From there, the targeting can get confused based on where that content is placed and how it is tagged.

Below are some implementations of posting the translated content that we might see from the business. This table looks at a variety of combinations of ccTLDs, gTLDs, subfolders, subdomains, hreflang tagging, and geo-targeting. Each combination of URL setup and tagging results in different targeting in the eyes of the search engines, which in turn changes the base number of Internet users in the target group.

[Image: table of URL setup and tagging combinations and their resulting targeting]

Given the above, you can see that the implementation is not as straightforward as it might seem. There’s no single right answer in the above possible implementations. However, many of them change the focus of the original target market (speakers of the French language) and that has an impact on the base target market.

International search strategy tool

This is what many of us face when trying to do international expansion: there is conflicting data on what should be done. This is why I developed a tool to help businesses determine which route they should take in international expansion. It helps them determine what their real focus should be (language, country, or both) and narrows down the list of choices above while taking into account their business needs, resources, and user needs. It has evolved over the years from a flow chart, to a poorly designed tool, to the better-structured tool found by clicking the link in the image below.

Start with those questions and then come back here when you have other questions. That’s what the rest of this guide is about. It’s broken down into three types of targeting:

  1. Language
  2. Country
  3. Hybrid (multiple countries with multiple languages)

No one type is easier than another. You really need to choose the path early on and use what you know of your business, user needs, and resources.

Language targeting

Language-only targeting can seem like the easiest route to take, as it doesn't require a major change or multiple instances of marketing plans. Country-focused targeting requires new targeted content for each targeted country. There are far fewer languages in the world than countries. In addition, if you target the major world languages, you could potentially start with a base of millions of users who speak those languages.

However, language targeting involves two very tricky components: translation and language tagging. If either of these components is not done right, it can cause major issues with user experience and indexation.

Translation

The first rule of working with languages and translation is NEVER machine translate. Machine translation is highly inaccurate. I was just at an all-inclusive resort in Mexico, and you could tell the translations were done by a machine, not a person. Using machine translations produces a very poor user experience and poor SEO targeting as well.

Translations of content should always be done by a human who is fluent both in that language and the original language of the content. If you are dealing with regional variations, it is recommended to get someone that is native to and/or living in that area to translate, as well as being fluent.

Spending the right resources on translation will ensure the best user experience and the most organic traffic.

Language tagging: Hreflang and meta language

When people hear about translation and international expansion, the first thing they think about is the hreflang tag. Relative to the Internet, the hreflang tag is new: it launched in late 2010 and, as of this writing, is used only by Google. If the bulk of your traffic comes from Google and you are translating only, this is of use to you. However, do know that Bing uses a different tag format, called the meta language tag.

  • Guide for implementing hreflang
  • Guide for implementing meta language

Tips: Ensure that every translated page carries an hreflang tag pointing to every other translated instance of that page. I prefer the tags be put in XML sitemaps (instructions here) to keep the tagging off the page, as any extra code adds to page load time, no matter how small. Do what works for your team.
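As a sketch of the sitemap approach (domains and URLs made up), each entry lists every translated version of that page, including itself, via xhtml:link elements:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.domain.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.domain.com/en/" />
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.domain.com/fr/" />
  </url>
  <!-- The French page gets a matching entry pointing back at both versions. -->
  <url>
    <loc>http://www.domain.com/fr/</loc>
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.domain.com/fr/" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.domain.com/en/" />
  </url>
</urlset>
```

Note the xhtml namespace declaration on the urlset element; without it, the alternate-link annotations are invalid.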

What about x-default?

One of the tagging mistakes that happens most often is using x-default. Many people misunderstand its use. X-default was added to the hreflang markup family to help Google serve un-targeted pages, like the global gateway pages of IKEA and FedEx, to users for whom the site has no language-targeted content, or whom Google doesn't know where to place. This tag is not meant to mark the "original" page.
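A quick sketch (URLs made up): x-default sits alongside the normal language tags and marks the page to serve when none of the listed languages match the user:

```html
<link rel="alternate" hreflang="en" href="http://domain.com/en/" />
<link rel="alternate" hreflang="fr" href="http://domain.com/fr/" />
<!-- Served when the user's language matches none of the tags above -->
<link rel="alternate" hreflang="x-default" href="http://domain.com/" />
```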

Checking for tagging issues

Once you have your tagging live (or on a testing server that is crawlable by Google but not indexable), you can check for issues inside of Google Search Console. This will let you know what tag issues you are having and where they’re located.

URL selections

Choosing the URL structure of your language extensions is totally up to you. If you are focusing on language targeting only, don’t use a ccTLD. Those are meant for targeting a specific country, not a language. ccTLDs automatically geo-target and that selection cannot be changed. Your other choices are subfolder, subdomain, and parameter. They’re listed below in order of my professional preference and why.

  1. Subfolders provide a structure that’s easier to build upon and develop as your site and business grows and changes. You might not want to target specific countries now or have the resources, but you may someday. Setting up a subfolder structure allows you to use the same structure for any future ccTLDs or subdomains for country sections in the future. Your developers will appreciate this choice because it’s scalable for hreflang tags, as well.
  2. Parameters allow a backup system in case your tagging fails in a site update in the future. Parameters can be defined in Google as being used to modify the language on the page. If your other tags are lost, that parameter setting is still telling Google that the content is being translated.
    Using a parameter for language is also scalable for future plans and easy for tagging, like subfolders. The downsides are that they’re ugly and might accidentally be negated by a misplaced rel canonical tag in the future.
  3. Subdomains are my least favorite option for language targeting. Only use this if it's the only option you have, by decree of your technical team. Using subdomains for languages means that if you change plans to target countries in the future, you'll lose many options for URLs there. To follow the same structure for each country, you would need to use ccTLDs; while those are the strongest signal for geo-targeting, they are also the option that requires the most investment.

Notice that ccTLDs are not on this list. Those are only for geo-targeting. Unless you’re changing your content to focus on a specific country, do not use ccTLDs. I say this multiple times for a reason: too many websites make this mistake.

Detecting languages

Many companies want to try to make the website experience as easy as possible for the user. They attempt to detect the user’s preferences without needing input from the user. This can cause problems with languages.

There are a few ways to try to determine a user's language preferences. The most-used are browser settings and IP address. It is never recommended to use the IP address for language detection. An IP address can show an approximate user location, but not the user's preferred language. IP location is also highly inaccurate (just the other day I was "in" North Carolina, though I live in Austin), and Google still only crawls from a US IP address. Any automatic redirects based on IP should be avoided.

If you choose to try to guess at the user’s language preference when they enter your site, you can use the browser’s language setting or the IP address and ask the user to confirm the choice. Using JavaScript to do this will ensure that Googlebot does not get confused. Pair this with a good XML sitemap and the user can have a great interaction. Plus, the search engines will be able to crawl and index all of your translated content.
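To make the guessing step concrete, here's a minimal sketch (the function name and supported-language list are hypothetical). It checks the browser's preference list, e.g. navigator.languages, against the languages you actually offer, falling back to a default rather than forcing anything:

```javascript
// Hypothetical helper: choose the best supported language for a visitor
// from the browser's preference list (e.g. navigator.languages).
function pickLanguage(preferred, supported, fallback) {
  const norm = (code) => code.toLowerCase();
  const supportedNorm = supported.map(norm);
  for (const pref of preferred.map(norm)) {
    // Try an exact match first (e.g. "fr-ca"), then the base language ("fr").
    if (supportedNorm.includes(pref)) return supported[supportedNorm.indexOf(pref)];
    const base = pref.split('-')[0];
    if (supportedNorm.includes(base)) return supported[supportedNorm.indexOf(base)];
  }
  return fallback; // no match: suggest the default, never force a redirect
}
```

You'd then ask the user to confirm the guess (and remember the answer) rather than redirecting them automatically.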

Country targeting, AKA geo-targeting

If your business or content changes depending on the location of the user, country targeting is for you. This is the most common answer for those businesses in retail. If you offer a different set of products, if you have different shipping, pricing, grouping structure, or even different images and descriptions, this is the way to go.

Example: If a greeting card business in the US wanted to expand to Australia, not only are the prices and products different (some different holidays), the Christmas cards are VASTLY different. Think of Christmas in summer, as it is in Australia, and only being able to pick from cards with winter scenes!

Don’t go down the geo-targeting route if your content or offerings don’t change or you don’t have the resources to change the content. If you launch country-targeted content in any URL structure (ccTLD, subdomain, or subfolder) and the content is identical, you run the risk of users coming across another country’s section.

Check out the flow chart at the end to help figure out why one version of your site might be ranking over another.

Example: As a web development service in Canada, you want to expand into the US. Your domain at the moment is http://www.webdevexpress.ca (totally made up!). You buy http://www.webdevexpress.us (that’s the ccTLD for the US, by the way). Nothing really needs to change, so you just use the same content and go live. A few months down the road, US clients are still seeing http://www.webdevexpress.ca when they do a brand name search. The US domain is weaker (fewer links, mentions, etc.) and has the same content! Google is going to show the more relevant, stronger page when everything is the same.

Regions versus countries

Which country or countries you want to focus on in expansion is usually decided before you determine how to get there; that decision is what spawns the conversation in the first place.

There’s one misconception that can throw off the whole process of expansion, and that is that you can target a region with geo-targeting. As of right now, you can purchase a regional top-level domain like .eu, but those are treated as general top-level domains like .com or .net.

The search engines only operate geo-targeting in terms of countries right now. The Middle East and the European Union are collections of countries. If you set up a site dedicated to a region, there are no geo-targeting options for you.

One workaround is to select a primary country in that region, perhaps one in which you have offices, and geo-target to that country. It's possible to rank for terms in that primary language in surrounding countries. We see this all the time with Canada and the US. If the content is relevant to the searcher, it's possible to rank no matter where the searcher is.

Example: If you’re anywhere other than the UK, Google “fancy dress” — you see UK sites, right? At least in the US, “fancy dress” is not a term we use, so the most relevant content is shown. I can’t think of a good Canadian/US term, but I guarantee there are some out there!

URL selections

The first thing to determine in geo-targeting beyond the target countries is URL structure. This is immensely important because once you choose a structure, every country expansion should follow that. Changing URL structure in the future is difficult and costly when it comes to short-term organic traffic.

In order of my professional preference, your choices are:

  1. Subfolders. As with the language/translation option, this is my preferred setup, as it utilizes the same domain and subdomain across the board. This translates to utilizing some of the power you already built with other country-focused areas (or the initial site). This setup works well for adding different translations within one country (hybrid approach) down the line.
    Note: If you go with subfolders on both, always lead with the country, then language down the line.

    Example: http://www.domain.com/us/es (US-focused, in Spanish language) or http://www.domain.com/ca/fr (Canada-focused, in Canadian French).

  2. ccTLDs. This is the strongest signal that you’re focusing your content on a specific country. They geo-target automatically (one less step!), but that has a downside as well. If you started with a ccTLD and expanded later, you can’t geo-target a subfolder within a ccTLD at this point in time.
    Example: http://www.domain.ca/us will not work to target the US. The target will remain Canada. It might rank in the US, depending on the term competition and relevance, but you can’t technically geo-target the /us subfolder within the Canadian ccTLD.
  3. Subdomains. My last choice, because while you’re still on the same root domain, there’s that old SEO part of me that thinks a subdomain loses some equity from the main domain. BUT, if your tech team prefers this, there’s nothing wrong with using a subdomain to geo-target. You’ll need to claim each subdomain in Search Console and Bing Webmaster Tools and set the geo-target for each, just as you would with subfolders.
    Example: gb.domain.com

Content changes

The biggest question asked when someone embarks on country-targeting expansion is: “How much does my content need to change to not be duplicated?” In short — there is no magic number. No metric. There isn’t a number of sentences or a percentage. How much your content needs to change per country site or subsite is entirely up to your target market and your business.

You’ll need to do research into your new target market to determine how your content should change to meet their needs. There are a number of ways you might change your content to target a new country. The most common are:

Product differentiation

If you offer a different set of products or services to different countries by removing those that are not in demand, outlawed, or otherwise not wanted, or by adding new products for that country specifically, that is changing your site content.

Example #1: Amazon sells the movie “Elf” in the US and the UK, but they are different products. DVDs in Europe are coded for Europe and might not play on US players.

Example #2: Imagine you’re a drugstore in the UK and want to expand to the US. One of your products, 2.5% Selenium Sulphide, is not approved for use in the US. This is one among hundreds or thousands of products that are different.

Naming schema

The meaning of product names can change in different countries. How a specific region terms a product or service can change as well, making it necessary to change your product or service naming schema.

Keyword usage

Like the above, the words you use to describe your products or services might change in a new country. This can look like translation, but if it's the change of just a few terms, it's not considered full translation. There's a fine line between the two. If you realize that the only thing you're changing is the wording between US and UK English, for example, you might not need to geo-target at all; instead, mark the different pages as translations.

Keyword use change example: “Mum” versus “Mom” or “Mother” when it comes to Happy Mother’s Day cards. You need to offer different cards in this and other categories because of the country change. This is more than a word change, so it’s a case of geo-targeting — not just translation.

Translation change example: Etsy.com. Down at the bottom of the page, you can change your language setting. I set mine to UK English, and words like “favourite” started to show up. If this sounds like what you would need to do and your content would not change otherwise (Etsy shows all content to all users regardless of their location), consider translation only.

Pricing structure

One of the most common things to change in country-specific content is pricing. There's the issue of different currency, but more than that, different countries have different supply and demand markets that should and will change your pricing structure.

Imagery changes

When dealing with different cultures, sometimes you find the need to change your site imagery. If you’ve never explored psychology, I highly recommend checking out The Web Psychologist – Nathalie Nahai and some of her talks. Understanding your new target market’s culture is imperative to marketing effectively.

Example: Samsung changes the images on their UK versus China sites to change the focus from an individualistic to a collectivistic culture. See my presentation at SearchLove San Diego for more examples.

Laws, rules, and regulations

One of the most important ways to change your content is to satisfy local laws and regulations. This depends on the business: some deal with many, while others deal with none. Check out local competitors — the biggest you can identify — to see what you might need to do.

Example: If you move into the UK and set cookies on your visitor’s machine, you have to alert them to the use of cookies. This is not a law in the US and is easily missed.

User experience and IP redirects

When people start moving into other countries, one of the things they want to ensure is that users get to the right content. This is especially important when products change and the purchase of an incorrect product would cause issues for the user, or the product isn’t available to them. Your customer service, user experience, or legal team is going to ask that you redirect users to the correct country. Everyone gets to the right place and the headaches lessen.

There isn’t anything wrong with asking a user to select the country they reside in and set a cookie, but many people don’t want to bother their users. Therefore, they detect the user’s IP address and then force a redirect from there. There are two problems with this setup.

  1. IP addresses are inaccurate – I was in Seattle, WA once and my IP had me in Washington, DC. No kidding. Look at that distance on a map. Think about that distance in terms of Europe and how much might change there.
  2. Google crawls from California – For the time being, using an IP-based forced redirect will ensure your international content is not indexed. Google will only ever see the US content if you do a forced redirect.

You can deal with this by detecting the country using the IP address (or, for organic traffic, which version of Google they came from) and using a JavaScript popup to ask what the user's preferred country is, then setting a cookie with that preference. Even if the user clicks on another country's content in the future, they will be redirected to their own.
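A minimal sketch of that decision logic (names hypothetical): the confirmed cookie preference always wins, and the IP-derived guess is only ever a suggestion to confirm via the popup, never a forced redirect:

```javascript
// Hypothetical helper: decide which country section to show a visitor.
// A previously confirmed cookie preference always wins; an IP-based guess
// is only a suggestion, to be confirmed by the user in a JavaScript popup.
function resolveCountry(cookiePref, ipGuess, defaultCountry) {
  if (cookiePref) return { country: cookiePref, confirmed: true };
  if (ipGuess) return { country: ipGuess, confirmed: false }; // ask the user
  return { country: defaultCountry, confirmed: false };
}
```

Because the suggestion runs in JavaScript rather than a server-side redirect, Googlebot still crawls and indexes every country's content.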

No hreflang??

If you went through that tool, you noticed that my geo-targeting plan does not include hreflang. Many other people disagree with me on this point, saying that the more signals you can send, the better.

Before I get into why I don’t recommend setting up hreflang between country targeted sub-sites, let me make one thing clear. Setting up hreflang will not hurt your site if you are really focusing on country targeting and it’s not that intricate of a setup yet (more on that later). Let’s say you’re in Canada and want to open a US-targeted site. Your content changes because your products change, your prices change, your shipping info changes. You create domain.com/us and geo-target it to the US. You can add hreflang between each page that is the same between the two sub-sites — two products that exist in both locations, for example. The hreflang will not hurt.

Example: If you don’t have the resources to change your content at the moment to fully target the UK, only translate your content a bit between your US (domain.com) and UK (domain.co.uk), and have plans to change your content down the road, an hreflang tag between those two ccTLDs can help Google understand the content change and who you’re targeting.

Why I don’t recommend hreflang for geo-targeting only

Hreflang was meant to help Google understand when two pages are exactly the same, but translated. It works much like a canonical tag (which is why using another canonical can be detrimental to the hreflang working) in which you have multiple versions of one page with slight changes.

Many people get confused because of the ability to use country codes in the hreflang tags. This is for when you need to tell Google of a dialect change. An example would be two sub-sites that are identical except the American English has been changed to British English. It's not meant to inform Google that a piece of content is targeted at a particular country; that's what geo-targeting is for.

When I recommend geo-targeting only, I make it very clear to clients that going down this route means you really need to change the content. International business is so much more than just translation. Translating content only might hurt your conversion rates if you miss some aspect of the new target market.

Hiring content writers in that country that understand the nuances is very important. I worked for a British company for 4 years, so I get some of the differences, but things continually surprise me still. I would never feel comfortable as an American writing content for a British audience.

I also don’t recommend hreflang in most geo-targeting cases, because the use of geo-targeting and hreflang can get really confusing. This has led to incorrect hreflang tags in the past that have wreaked havoc on Google’s understanding of the site structure.

Example: A business starts off with a Canadian domain (domain.ca) and a France domain (domain.fr). They use hreflang between the English for Canada and French for France using the code below. They then add a US site and the code is modified to add a line for the US content.

<link rel="alternate" hreflang="en" href="http://domain.ca/" />
<link rel="alternate" hreflang="fr" href="http://domain.fr/" />
<link rel="alternate" hreflang="en-us" href="http://domain.com/" />

This looks odd because there is one English-language page with no regional modifications that is on a Canadian-targeted domain. There is a US regional English dialect version on a general top-level domain (as .com is general and is not US-specific, but people use it that way).

Remember, this is a bot that’s trying to logic out a structure. For a user that prefers UK English, there is no logical choice. The general English is a Canadian site and the general TLD is in US English. This is where we get some of the inconsistencies with international targeting.

You might be saying things like “That would never happen!” and “They should have changed the first English to Canadian English (en-ca)!”, but if you’ve ever dealt with hurried developers (they really do have at least 50 requests at once sometimes) you’ll know that they, like search bots, prefer consistency.
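For contrast, a consistent version of that tag set, with the regional dialect made explicit on the Canadian domain (same made-up domains), gives the bot a structure it can logic out:

```html
<link rel="alternate" hreflang="en-ca" href="http://domain.ca/" />
<link rel="alternate" hreflang="fr" href="http://domain.fr/" />
<link rel="alternate" hreflang="en-us" href="http://domain.com/" />
```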

Hreflang should not be needed in geo-targeting cases because, if you're really going to target a new country-specific market, you should treat it as a whole new market and create content just for it. If you can't, or don't think that's needed, then providing language translations is probably all you need to do at the moment. Moreover, hreflang in geo-targeting cases can introduce code that confuses the search engines, and the less we confuse them, the better the results!

Hybrid targeting

Finally, there is the route I call “hybrid,” or utilizing both geo-targeting and translation. This is what most major retail corporations should be doing if they’re international. Due to laws, currency, market changes, and cultural changes, there is a big need for geo-targeted content. But in addition to that, there are countries that require multiple language versions. There might be anywhere from one to a few hundred used languages in a single country! Here are the top countries that use the web and how many recognized languages are used in each.

[Image: top countries by Internet use and the number of recognized languages used in each]

Do you need to translate into all 31 languages used in the US? Probably not. But if 50% of your target market in Canada prefers Canadian French as their primary language, the translation investment might be a good one.

In cases where a geo-targeted site (ccTLD use) or sub-site (subdomain or subfolder) needs more than one language, then there is the need to geo-target the site or sub-site and then use hreflang within that country-specific site.

This statement can be confusing, so let me show you what I mean:

[Image: diagram of geo-targeting each country-specific site or sub-site, with hreflang used between languages within it]

This requires a good amount of planning and resources, so if you need to embark on this path in the future, start setting up the structure now. If you need to go the hybrid route, I recommend the following URL structures for language and country targeting. As with before, these are in order of my professional preference and are all focused on content targeted to Canada in Canadian French.

(Country structure/Language structure)

  1. Subfolder/Subfolder
    Example: domain.com/ca/fr
  2. Subfolder/Parameter
    Example: domain.com/ca?lang=fr
  3. ccTLD/Subfolder
    Example: domain.ca/fr
  4. ccTLD/Parameter
    Example: domain.ca?lang=fr
  5. Subdomain/Subfolder
    Example: ca.domain.com/fr
  6. Subdomain/Parameter
    Example: ca.domain.com?lang=fr
  7. ccTLD/Subdomain (not recommended, nor are the other combinations I intentionally left out)
    Example: fr.domain.ca
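As an example of option 1 (subfolder for country, subfolder for language), the hreflang tags inside the Canada sub-site might look like this (domains made up):

```html
<link rel="alternate" hreflang="en-ca" href="http://www.domain.com/ca/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.domain.com/ca/fr/" />
```

Each country sub-site would then also be geo-targeted in Search Console, with the hreflang tags handling only the language split within it.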

The hybrid option is where the hreflang setup can get the most messed up. Make sure you have mapped everything out before implementing, and ensure you’re considering future business plans as well.

I hope this helps clear up some of the confusion around international expansion. It really is specific to each individual business, so take the time to plan and happy expansion!

Troubleshooting International SEO: A flowchart

[Image: flowchart for troubleshooting international listings showing up in the wrong place in the SERPs]



Are Keywords Really Dead? An Experiment

Posted by sam.nemzer

[Estimated read time: 6 minutes]

A quantitative analysis of the claim that topics are more important than keywords.

What’s more important: topics or keywords? This has been a major discussion point in SEO recently, nowhere more so than here on the Moz blog. Rand has given two Whiteboard Fridays in the last two months, and Moz’s new Related Topics feature in Moz Pro aims to help you to optimize your site for topics as well as keywords.

The idea under discussion is that, since the Hummingbird algorithm update in 2013, Google is getting really good at understanding natural language. So much so, in fact, that it’s now able to identify similar terms, making it less important to worry about minor changes in the wording of your content in order to target specific keyword phrases. People are arguing that it’s more important to think about the concepts that Google will interpret, regardless of word choice.

While I agree that this is the direction that we’re heading, I wanted to see how true this is now, in the present. So I designed an experiment.

The experiment

The question I wanted to answer was: “Do searches within the same topic (but with different keyword phrases) give the same result?” To this end, I put together 10 groups of 10 keywords each, with each group’s keywords signifying (as closely as possible) the same concept. These keywords were selected to represent a range of search volumes and to span the spectrum from informational to transactional. For example, one group of keywords is all synonymous with the phrase “cheapest flight times” (not-so-subtly lifted from Rand’s Whiteboard Friday):

  • cheapest flight times
  • cheapest time for flights
  • cheapest times to fly
  • cheap times for flights
  • cheap times to fly
  • fly at cheap times
  • time of cheapest flights
  • what time of day are flights cheapest
  • what time of day to fly cheaply
  • when are flights cheapest

I put the sample of 100 keywords through a rank-tracking tool, and extracted the top ten organic results for each keyword.

Then, for each keyword group, I measured two things.

  1. The similarity of each topic’s SERPs, by position.

    • For example, if every keyword within a group has the same page ranking no. 2, that result will score 10. If 9 results are the same and one is different, nine results will get a score of 9, and the other will score 1.
    • This score is then averaged across all 100 (10 results * 10 keywords) results within each topic. The highest possible score (every SERP identical) is 10, the lowest possible (every result different) is 1.
  2. The similarity of each topic’s SERPs, by all pages that rank (irrespective of position).
    • As above, but scoring each keyword’s results by the number of other keywords that contain that result anywhere in the top 10 results. If a result appears in the top 10 for all keywords in a topic group, it scores a 10, even if the results in the other keywords’ SERPs are in different positions.
    • Again, the score is averaged across all results in each topic, with 10 being the highest possible and 1 the lowest.
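To make the scoring concrete, here’s a minimal sketch of both metrics in Python. This is my own reimplementation of the scoring described above, not the code used for the experiment; each SERP is represented simply as a list of ranked URLs.

```python
from collections import Counter

def by_position_score(serps):
    """Average, over every (keyword, position) slot, of how many
    keywords in the group show the same URL at that position.
    With n keywords per group, identical SERPs score n."""
    n, depth = len(serps), len(serps[0])
    total = 0
    for pos in range(depth):
        counts = Counter(serp[pos] for serp in serps)
        total += sum(counts[serp[pos]] for serp in serps)
    return total / (n * depth)

def all_pages_score(serps):
    """Same idea, but a result matches if it appears anywhere in
    another keyword's top results, regardless of position."""
    n, depth = len(serps), len(serps[0])
    url_sets = [set(serp) for serp in serps]
    total = sum(sum(url in s for s in url_sets)
                for serp in serps for url in serp)
    return total / (n * depth)

# Two keywords with the same pages in reversed order: the position
# score collapses to 1, while the all-pages score stays at the maximum.
serps = [["a", "b"], ["b", "a"]]
print(by_position_score(serps))  # 1.0
print(all_pages_score(serps))    # 2.0
```

With 10 keywords per group and 10 results per SERP, both scores range from 1 (every result different) to 10 (every SERP identical), matching the description above.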

Results

The full analysis and results can be seen in this Google Sheet.

This chart shows the results of the experiment for the 10 topic groups. The blue bars represent the “by position” score, averaged across each topic group, and the red bars show the average “all pages” score.

The most striking thing is the wide range of results. Topic group D’s SERPs are 100% identical across its keywords if you don’t take ordering into account, whereas group J’s keywords share only 38% of their results.

We can see from this that targeting individual keywords is definitely not a thing of the past. For most of the topic groups, the pages that rank in the top 10 have little consistency across different wordings of the same concept. From this we can infer that matching the exact keyword is still a primary factor in whether one page ranks where another does not.

Why is there such variation?

If we look into what factors might be affecting the varying similarities between the different topic groups, we could consider the following factors:

  • Searcher intent: Informational (Know) vs Transactional (Do) topics.
  • Keyword difficulty (competition levels).

Searcher intent

Although Google’s categorisation of searches into do, know, and go can be seen as a false trichotomy, it is still useful as a simple model of searcher intent. All of the keyword groups I used can be classed as either informational or transactional.

If we break up our topic groups in this way, we can see the following:

As you can see, there’s no clear difference between the two types. In fact, the highest- and lowest-scoring groups (D and J) are both transactional.

This means that we can’t say — based on this data, at least — that there’s any link between the search intent of a topic and whether you should focus on topics over keywords.

Keyword Difficulty

Another factor that could be correlated with similarity of SERPs is keyword difficulty. As measured by Moz’s keyword difficulty tool, this is a proxy for how strong the sites that rank in a SERP are, based on their Page Authority and Domain Authority.

My hypothesis here is that, for searches where there are a lot of well-established, high-DA sites ranking, there will be less variation between similar keywords. If this is the case, we would expect to see a positive correlation in the data.

This is not borne out by the data; if anything, the opposite is true: the higher the keyword difficulty across the keywords in a topic group, the less similarity there is between SERPs within that group. The correlation is fairly weak (R²=0.28), though, so we can’t draw any firm conclusions from it.
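For anyone reproducing this check, R² is straightforward to compute from the two per-group columns in the Google Sheet (average keyword difficulty and SERP similarity). Here’s a quick stdlib-only sketch; note that R² alone discards the direction of the relationship, which is why the negative slope is worth stating separately.

```python
def r_squared(xs, ys):
    """Coefficient of determination for a simple linear fit:
    the squared Pearson correlation of xs and ys."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov * cov / (var_x * var_y)

# A perfectly linear relationship scores 1.0 whether the slope is
# positive or negative; noisier data drops toward 0.
print(r_squared([1, 2, 3], [2, 4, 6]))  # 1.0
print(r_squared([1, 2, 3], [6, 4, 2]))  # 1.0
```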

One other factor that could explain the lack of pattern in this result is that 100 keywords in 10 groups is a fairly small sample size, and is subject to variation in the selection of keywords to go into each group. It is impossible to perfectly control how “close” in definition the keywords in each group are.

Also, it may just be the case that Google simply understands some concepts better than others. This would mean it can see some synonyms as being very closely related, whereas for others it’s still perplexed by the variations, so looks for specific words within the content of each page.

Conclusion

So, should we listen to Rand when he tells us to forget about keywords and focus on topics? Somewhat unsatisfyingly, the answer is a strong “maybe.”

While for some search topics there’s a lot of variation based on the exact wording of the keywords, for others we can see that Google understands what users mean when they search and sees variations as equivalent. The key takeaway from this? Both keywords and topics are important.

You should still do keyword research; it’s always going to be essential. But you should also consider the bigger picture and, as more natural language processing tools become available, take advantage of them to understand the overall topics you should write about, too.

It may be a useful exercise to carry out this type of analysis within your own vertical, and see how well Google can tell apart the similar keywords you want to target. You can then use this to inform how exact your targeting should be.

Let me know what you think, and if you have any questions, in the comments.



Why You Need to Find All Your NAP Variations Before Building Local Citations – Whiteboard Friday

Posted by Whitespark

[Estimated read time: 6 minutes]

Citation consistency got you down? It’s one of the most important local search ranking factors, but it can be an overwhelming task to find inconsistencies, and it’s often easy to create accidental duplicate listings. In today’s guest Whiteboard Friday, Darren Shaw, founder of Whitespark and recent speaker at MozCon Local, outlines a foolproof way to discover all your NAP variations to prepare for proper citation building.

Why You Need to Find all Your NAP Variations Before Building Local Citations Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey there, Moz fans, I’m Darren Shaw from Whitespark, and I’m here today to talk to you about why you need to find all of your NAP variations before building local citations. It’s an important topic to cover because citation consistency is one of the most important local search ranking factors. In last year’s local search ranking factor study, it was ranked the number two most important factor, second only to actually having a business located in the city you’re trying to rank in.

Before I get into it, I want to cover a couple of definitions really quickly. So NAP stands for name, address, and phone number. A citation is basically a mention of your name, address, and phone number somewhere on the Web. Typically you’re going to see that on sites like YellowPages.com or Yelp.com or Superpages, but you can find them on all kinds of sites: blogs, newspapers, whatever. Anywhere your name, address, and phone number are mentioned, that’s a citation.

I want to illustrate a mistake we see happening all the time in citation building. I’m going to show you an example here.

Bob is a lawyer.

He wants to figure out how he can rank in the local pack.

So he does a little research. He comes across an article on how to rank in local SEO. He reads about citations and how that can help with his local rankings. The article suggests that he uses the local citation finder to research his competitors and find citation opportunities. So he’s done that. He’s got his list, and he’s off to submit his business to all the various directories.

He goes to YellowBiznass.com, and he’s thinking, well, maybe I already have a listing there.

So he searches for his phone number, doesn’t find anything. He thinks he’s all in the clear so he creates a listing.

This is the mistake he’s made: he didn’t think about the fact that he already had a citation, because he used to use his old cell number for the business. If he had searched for his cell number, he would have found the old listing on the site. So now he’s got an old, incorrect listing on the site, plus a new duplicate listing. He’s created a citation consistency problem, and it’s not really helping him rank well.

We solve this problem in our Whitespark citation services with this four-step process to find all the different NAP variations.

How to find NAP variations

Step one, we ask the business: Tell us about any previous business names that you’ve had, if you’ve changed your name in the past or if you have a corporate account. How about any addresses? Have you moved locations? Do you have any secondary locations? Do you have your business registered at a corporate address? Phone numbers? Any call tracking numbers, toll-free numbers, cell numbers, any past numbers that you’ve used for the business? This is a great way to start. We get a list of all that stuff.

Next, we’ll go to Moz Local and run a search for the business name and ZIP. Because Moz Local queries all the primary data aggregators and a number of other important sites in the local search ecosystem, it tends to surface a lot of NAP variations. So we use this to add to our list.

Third, we’ll go to YellowBot and MerchantCircle. These two sites are interesting because they collect data from a number of different sources but don’t do a very good job of merging listings as the data comes in, so they end up with a lot of duplicates. That makes them a pain to clean up, but very helpful for this process.

So for this one, you just put in a portion of the business name. Bob’s business is Bob Loblaw’s Law Firm. Instead of just putting the whole thing in, we’ll just put in Loblaw to help surface variations. Here are some variations that might come up. We’ve got Loblaw and Sons LLC, Loblaw’s Law, and Bob Loblaw’s Law Blog.

After we get that, we’re going to go to Google. We’re going to search Google. Now what we’re doing here is we’re trying to find any variations that we didn’t already discover. We already have a number of different names. We have a number of different phone numbers. What we’ll do is we’re going to find more names. If you put in a phone number for the business and you exclude all the names, Google’s going to surface any sites or any pages that mention that phone number without any of these names. That helps you to surface any variations that you weren’t aware of. You do that for each phone number.

Then, on the phone number side, if we’re trying to find more phone numbers, we’ll search for the business name and exclude the phone numbers we already know of. By excluding the known numbers, we might surface new ones.

The one trouble with this one is it tends to surface a lot of pages where you just mention the business without the phone number, and that’s a common thing. So what I do in this case is I scan the results looking for obvious business directories, if it’s like a Yelp or a Foursquare or anything like that. I’m looking for business directories in these results.

Then we want to see if there are any other addresses we missed. Put in the phone number and exclude the addresses that you’re already aware of. Do that for each phone number.

An important tip here with the addresses: don’t use the full address. Let’s say, for example, your address is 5329 Saddleback Road South, Suite 705. Don’t put in the whole thing; if you put that in quotes, it’s only going to match pages with that exact string. Just put in the portion that’s going to be common to all the sites, like 5329 Saddleback.
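The searches Darren describes all follow one pattern: quote a known NAP element, and exclude every variant you already know of the element you’re hunting for. As a sketch, here’s a small helper that builds those query strings (the function and inputs are illustrative, not a Whitespark tool):

```python
def variation_queries(names, phones, address_fragments):
    """Build Google queries that surface unknown NAP variations.
    Inputs are the name/phone/address variants already collected;
    address_fragments should be short, e.g. "5329 Saddleback"."""
    def exclude(items):
        return " ".join(f'-"{i}"' for i in items)
    queries = []
    # Find new names: each known phone, minus every known name.
    queries += [f'"{p}" {exclude(names)}' for p in phones]
    # Find new phones: each known name, minus every known phone.
    queries += [f'"{n}" {exclude(phones)}' for n in names]
    # Find new addresses: each known phone, minus known fragments.
    queries += [f'"{p}" {exclude(address_fragments)}' for p in phones]
    return queries

for q in variation_queries(["Loblaw Law"], ["780-555-0100"],
                           ["5329 Saddleback"]):
    print(q)
# "780-555-0100" -"Loblaw Law"
# "Loblaw Law" -"780-555-0100"
# "780-555-0100" -"5329 Saddleback"
```

Run each generated query in Google, scan the results for business directories, and add anything new to your list of variations.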

At the end of this process, you should have a very nice, clean list of all of your various names, addresses, and phone numbers for the business.

Now you’re ready to build citations.

All you have to do at this point is just make sure that you check all NAP variations on the site before you submit a listing.

There are two ways to do that. One, you can use the site search feature to search by name and/or phone number. You’re going to run all those, but you can’t always rely on this. Some of the sites have a really crummy search feature. It doesn’t work very well.

We always double-check by running a number of Google searches as well, using the site: operator. You’ll search site:local.com followed by each variation in quotes: phone one, phone two, name one, name two, address one.

You’re going to go through all of the different variations that you’re aware of. At the end of that if you found a listing, you want to claim it and update it. If you didn’t find a listing, then you’re clear to submit. This way, you’ll be creating citations without worrying about creating duplicate listings.
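Those pre-submission checks lend themselves to the same treatment: one site: query per NAP variant against the directory’s domain. A hypothetical helper (names and numbers are made up for illustration):

```python
def site_check_queries(domain, names, phones, addresses=()):
    """One Google site: query per known NAP variant, for checking a
    directory (e.g. local.com) for existing listings before you
    submit a new one."""
    variants = list(phones) + list(names) + list(addresses)
    return [f'site:{domain} "{v}"' for v in variants]

queries = site_check_queries("local.com", ["Loblaw Law"], ["780-555-0100"])
print(queries[0])  # site:local.com "780-555-0100"
```

If any of these queries turns up a listing, claim and update it; if none do, you’re clear to submit.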

That’s everything. I hope it’s been helpful. I want to say a big thanks to Nyagoslav Zhekov who helped me organize all this information and a big thanks to Moz for having me. If you have any questions, please leave them in the comments. Thanks very much.

Video transcription by Speechpad.com



The Brand as Publisher Masterplan – Reinventing Content Marketing for the Next Decade

Posted by SimonPenson

[Estimated read time: 20 minutes]

Introduction and background

The how-to process

Setting up your team

Free downloads and help guide


Content marketing has an image problem.

Like all potentially transformational opportunities, the world sees something glistening and jumps in head first to claim a piece of the next “goldmine.”

The ensuing digital gold rush often creates a stampede to be first, rather than best, and strategic thinking is usually replaced with a brain-out approach to delivery.

And in the case of the content marketing revolution, the result has been an outpouring of disconnected content that adds little value and serves very few, leaving many with nothing more than a handful of “failed” content campaigns to show for the effort.

It’s something I see every day, and it is incredibly saddening.

Content marketing, you see, is not the answer to those prayers. It’s simply part of a much broader strategic picture that I call the “Brand as Publisher” play; a reset of the core principles behind the content marketing charge.

This piece is designed to explain precisely how you can take the “Brand as Publisher” approach, what it is, and how it can help your business succeed with content.

I’ve even created a unique Brand as Publisher Toolkit for you to download to help in that quest! Click the banner below (or at the bottom of the post) to grab a copy.

Defining the opportunity

So, what exactly is “Brand as Publisher?” Put simply, it’s changing your mindset to put content at the forefront of your business, almost before the products or services that you sell.

As controversial as that may seem, the idea is that you’re able to build an engaged, loyal audience of value for your brand… an audience you can then monetize later.

It’s a long-term play, without doubt, and one that requires consistent investment of time, resources, and cold, hard cash, but it creates “cast-in-granite” value that your competitors will find impossible to steal away from you.

Those who take the time and effort to do it will beat their competitors in the long run, and there will be little those competitors can do about it!

Changing mindsets

Now, before you close your browser, let me add a dose of reality. The suggestion is NOT that you start a magazine or newspaper for a living, but instead take the value from that business model and leverage it for your gain.

In many ways, you must start to…


“Imagine yourself as THE leading consumer magazine for your market.”


The easiest way to do that is to imagine your business as a magazine, as the leading publication for your specialist market and THE place anyone with even the slightest interest in your area of expertise goes to expand their knowledge.

Think about it for a second. In the same way that newspapers and magazines create “value” by sharing quality content on their specialist area and then building an audience around it, so can you.

Where they then monetize that audience by selling ad space, you may do the same by selling related products or services, or capturing leads.

The ability to create what I call “target audiences of value” in this way is how value has always been created. And with those eyeballs now focused online more than ever before, there has never been a better time to capture it.

The challenge is that few understand how to make content work long-term. While many brands (and agencies, for that matter) make a song and dance about delivering amazing campaigns, there is a very real need to get back to the basics and build, not just a campaign plan, but a longer-term brand content plan.

This excellent piece for Adage does a great job of arguing why we really do now need to focus on “proper content strategy” and not just on delivering content, particularly from agencies.

Recreating it online

This post is designed to share the secrets honed by the magazine industry over the last six decades; to share the principles that will maximize your chance of success with a content-led strategy.

To make that more digestible, the approach is broken down into a series of integral “pillars,” the first of which focuses on audience insight.

Pillar One: Audience understanding

This process starts and ends with people: a clear understanding of who your audience already is and, critically, who you want to consume your content.

Traditionally, the process of gathering insight would have been carried out by running reader focus groups, an often fascinating series of meetings with existing readers and those who currently don’t purchase but are very much “in the market.”

It’s a process I ran as editor of a British specialist car magazine called Max Power, visiting six different locations across the country to meet between four and twelve existing and potential readers.

Those candidates were selected by our own subscriptions team and from the wider industry events we attended on a regular basis in order to “stay close to the audience.”

A budget then allowed us to work with a professional research agency to run structured Q&A sessions. In reality, however, you can run the same sessions at a bar, provided you prepare the right questions beforehand.

Every business will have different insight needs. One way of determining which questions to ask is to first capture the key outputs you wish to come away with:


1. Who is currently buying your product or service?

2. Why are other people not buying it?

3. What general trends are affecting these people’s lives at the moment?

4. Where would people buy your product or service from?

5. When, where, and how would they use or consume it?

6. Why would they buy it? What need do they want to satisfy?

7. Who is your real competition?

8. What image do people have of your brand vs. your competitors?

9. What do they think about the different aspects of your product or service (name, packaging, features, advertising, pricing…)?

10. What improvements could be made to your product or service to meet people’s needs even better?

11. What is the single most important benefit your brand should be seen to be offering?

12. How can you best communicate that benefit to the people you’re interested in attracting?

13. What is the right price to charge?

14. What other new products or services could your brand offer people?


Questions can then be crafted to capture that information easily, and you’ll go to those research meetings armed and ready.

Digital insight

That real-world data can be further improved with the addition of digital insight. I have written several times about my process for extracting useful customer information from Facebook and also how you can use paid-for tools such as Global Web Index to form an understanding of how your audience interacts with your brand and wider market.

Combining the qualitative information you collect in the focus group meetings with the quantitative data you can access digitally will allow you to create data-informed personas for your brand as publisher strategy.

Pillar Two: Personas

Having a clear view on who you wish to target helps steer and shape everything you do editorially. If I rewind back to those Max Power days, we went as far as painting those personas clearly on meeting room walls and in the main office so we were constantly reminded of whom we were there to work for.

How you pull personas together is the subject of a lengthy post in its own right, but guides like these will help you do just that:

  • Personas: The Art and Science of Understanding the Person Behind the Visit
  • Keyword-Driven Personas

The point is to put a human face on the data you have bundled into audience segments. Doing so enables not just the team pulling the information together, but also the wider business, to understand who those people are and how their needs differ.

It is also a very good idea, as I’ve written previously, to try to align those personas to celebrities. This really lifts each persona into a living, breathing character that everyone can understand in much greater detail. We all know, for instance, how Beyoncé talks, holds herself, and may be portrayed attitudinally.

Pillar Three: Editorial mission statement

With the audience piece complete, the next stage is to create your “Editorial/Content Mission Statement”: the crystallisation of your content values and objectives.

Any good content team will have this burnt into their retinas, such is the importance of having a statement that outlines what you stand for. This is your guiding light when creating content, focusing on who your audience is and how you’ll serve them. It should be the measuring stick by which you evaluate all of your content.

A great example of this done well can be found hidden within the wider brand documentation for a brand like Sports Illustrated:


“Sports Illustrated covers the people, passions and issues of numerous sports with the journalistic integrity that has made it the conscience of all sport. It is surprising, engaging, and informative, and always with a point of view that puts its readers ‘in the game.’”


It’s a good example for several reasons because it captures all the key focal points for the brand succinctly. Below we can see how they have managed to cover the key pillars in their editorial strategy:

Our positioning on Max Power was also captured in a similar mission statement, succinctly defined as:

“The definitive guide to arsing [sic] around in cars.”

Editorially, we ensured we injected “attitude,” “fun,” and “entertainment” into every issue, while also maintaining our stance as “experts” and “trend setters” in what was a fast-moving youth market.

Pillar Four: Content flow

We knew that by staying close to our audience, we would continue to lead the market due to our reach. But we also knew that as we covered a wider audience of car enthusiasts, we needed to ensure that our publication was reflective of the audience/readership.

This meant thinking very hard about the “flow” of the magazine; what mix of content we included and how it was delivered over time.

Content flow is a process I have written about previously here, but it’s worth covering again, such is its importance. Getting it right is the difference between campaign delivery and truly connected content strategy.

The basis of flow is having the right mix of content to deliver from page-to-page, or day-to-day in the case of digital.

The best way of doing this is via a process known as “flatplanning,” a print publishing technique that also lends itself well to digital planning.

Pillar Five: Flatplanning and regulars (reinventing the wheel)

So, how does flatplanning work?

The concept is a very simple one for print publications: you recreate the pages you wish to fill with lovely content schematically in a document that looks a little like the one below.

You’ll see that I have started populating this to give you an idea of how it worked in the Max Power example we’ve been using.

Above, you’ll see how each element, or content idea, has been added to the plan. Doing it this way makes it very easy to visualize how the strategy ebbs and flows in terms of the variation of pace afforded by the different types of content you include.

Take, for instance, the first few pages here. You can quickly see that we kick-start with some shorter-form, faster-paced news on pages 1–12, before changing pace with a four-page longer-form piece on pages 13–16, then going back to a two-page piece.

Obviously, in the print world the ONLY variations you can play with are article length and writing style, but when it comes to digital, the opportunities are endless.

Flow

In the online world you have a plethora of media types to play with to add extra zing to your content strategy. The key to getting the “flow” correct is to use this flatplan technique, with the pages being hours, days, weeks, or whatever other measurement of time is relevant for your plan.

We often refer to the brilliant Smart Insights content matrix as part of this content type planning process. You can see below that it includes all of the key content types and adds insight into which part of the customer purchase and intent cycle each sits in.

I’ve created a new resource to help further with this process, based on the same principles. The Content Flow Matrix helps you understand which content types to use based not only on where they sit within the purchase funnel (the vertical axis) but also on the relative “size” of the content.

By choosing a mix of content types AND a mix of content “sizes,” you end up with the right mix of variation to ensure your content audience remains engaged and that they come back for more.

Pillar Six: Front cover insight

But while variation is a great thing, it’s also very important to make clear what the cornerstones of that strategy are, and to consistently and clearly reinforce and deliver that for your audience.

The way this works in print is to utilize the cover “sells” to deliver consistent messaging.

One of the very best exponents of this is Men’s Health magazine, a media brand that very much understands its readership and where and how it can add value.

Below you can see a randomly selected front cover highlighting what I call the “Editorial Pillars” of the brand — the cornerstones of its strategy.

Every single month, the cover will feature content that offers to help you improve your mind, body, or sexual performance:

Digital content strategy requires the same focus. Part of your overall strategic planning process should include a session to establish what those pillars are for your brand.

Below, you can see how a template front cover may look, complete with spaces for your editorial pillar planning. I have also included a copy of this in the free Brand as Publisher Toolkit download bundle created specifically for this piece to help you build your own strategy.

What might that look like for my agency, Zazzle Media? Here’s a fun example created by our designers to give you an idea of how it might be pulled together:

Getting it right will mean greater engagement, more return visitors, and more sharing of and linking to your content.

Marketing and incentivising purchase

So, with a clear proposition and great content delivered with variation and clear messaging, you’re ready to roll, right?

Almost, but not quite. Often the key difference between a magazine being successful and just being “OK” was the quality of its marketing strategy.

If anything, this is even truer in the digital sphere. Thinking about how you reach your audience is what this blog is all about.

The challenge, digitally, is that while access to your audience is faster and easier, the barriers to entry that protected traditional media for so long are no longer there. And that means competition, and lots of it.

In print, the only truly effective ways of growing market share were to improve distribution (be in more stores), optimize your position on the newsstand (be more visible), or invest in gifting (giving away free stuff on the cover).

These strategies translate nicely to digital in the following ways:

  • Ensure you have a strategy for all relevant channels (social, search, influencer channels) to maximize reach.

  • Optimise all channels to maximize effectiveness (SEO is especially important here).

  • Incentivize. This will look different for all businesses; for example, in ecommerce this may be money-off codes.

One final area of investment for publisher brands is live events. This is, again, a cornerstone of a brilliant brand as publisher play.

For those dealing with specialist markets (and that is exactly what we all do online), it has always been absolutely critical to stay close to the audience. One of the very best ways of doing that is to create, and run, semi-regular live events.

What they look like is completely dependent on what market you’re in, but if you truly understand your customers, you’ll almost always be able to add some experiential value.

For some, it may not be possible to do this in person. Where this is the case, regular Hangouts and/or webinars can fill the void.

Max Power was always famous for running regular meets throughout the UK for people to bring (and show off) their cars. It was a forum that kept us connected to the loves and hates of the market, and allowed us to establish strong relationships with the key influencers amongst them.

Events can be seen as the icing on the cake to many, but in reality they are one of the most important slices of the marketing cake.

Pillar Seven: The long tail

Another underestimated area of opportunity can be found within your regular content, the pieces you put out every day and that serve to stick together your bigger-bang campaign content.

In magazines, these pieces help create variation of pace as you turn the page. In the digital world, they can do much more.

Designing your long tail strategy in a way that takes advantage of long tail search opportunity is something I have covered as a standalone subject here at Moz previously, and I’d urge you to read the post to get the most out of your idea planning.

The added bonus now is also taking advantage of Google Answer Boxes.

By designing regular content to answer key questions that your audience is asking (I use Answer the Public and a keyword tool like Keyword Studio to help me understand this), you’re not only adding value to regular visitors’ lives — you’re also creating the opportunity to jump to the top of the SERPs.

Claiming those boxes requires real focus on article structure and good use of headlines, as this amazing study by Razvan at Cognitive SEO explains. If you achieve it, in our experience, you can expect to see a 15% increase in traffic from that keyword versus ranking first in the normal organic results.

Outside of the “content-for-long-tail-search” opportunity, regular content also serves to provide interaction opportunity. Using those “regular” slots to run polls, quizzes, more brand-led pieces and so on will enable you to not just provide variation but also improve brand understanding, resonance, and reach.

Pillar Eight: “Big Bang” content

The campaign content end of the spectrum is where most content strategists concentrate. This is a mistake. While Big Bang pieces can undoubtedly provide greater reach and attract more links, they alone do not constitute a strategy.

That said, they can certainly provide value — like a magazine made up solely of short-form content, a site without them loses readers quickly.

The “features” in a magazine — those articles that span four+ pages — are the print Big Bang equivalent. They often fit within those brand as publisher pillars we discussed earlier.

For Max Power, these would often take the form of a car road test, road trip, or interview — but in digital, the world is your oyster.

For instance, to give you a taste of what that means, we’ve recently produced content campaigns ranging from a vegetable cookbook to a supermarket shopping challenge to the Classroom of the Future.

Content types that lend themselves to Big Bang campaigns include:

  • Tools

  • Games

  • Data visualizations

  • Guides

  • Surveys and reports

  • Video

The key, once again, is ensuring there’s variation, even in your Big Bang output. So many brands will find a hit with one type and then stick with it, but that’s missing the point.

As with every part of your strategy, variation will always win. That’s how you stand out from the crowd in the long run.

Pillar Nine: Team structure and resources

Creating this variation is not an easy task. It requires a greater focus than ever on available skill sets.

You may think that what’s needed now from a team perspective is much more demanding than it was before, but that view isn’t necessarily correct.

To give you a view of what it took to pull together an issue of Max Power, here is the team we employed, with a brief explanation of each person’s role within the whole:

  • Editor – Responsible for the overall positioning and editorial strategy. Takes a longer-term view of issue planning and liaises with commercial and publishing teams to maximize sales opportunities and revenue. Works closely with all.
  • Publisher – Commercial-focused P&L owner responsible for distribution deals, production costs, and sales (the number of magazines sold AND ad revenue from it).
  • Deputy editor – Day-to-day ownership of the flatplan. Ensures content is delivered on time and to standard. The editor’s right-hand man/woman.
  • Production editor – Responsible for ensuring everything is produced on time. Liaises with the printers to ensure production standards are upheld.
  • Art editor – Leads the design team and is responsible for upholding design rules and the adoption of brand values throughout.
  • Designers – Layout and design all pages, and will artistically direct shoots to ensure that the design vision for individual features is carried through.
  • News/Features/Section editors – Lead a mini-team of specialist writers and are responsible for their output and the quality of their sections.
  • Writers – On-the-ground journalists who are out and about more than they’re in-office working on the individual articles and features.
  • Photographer – More often has a focus on photos, but may also have video skills.
  • Web team – In the early days of the net, this team ran separately to the “main” print team and often reconstituted print content for the web, ran communities, etc.
  • Advertising team – Responsible for selling all advertising space in the magazine (a key way of monetizing the audience).
  • Production team – Produce the adverts that the advertising team sells and supply them to the design team.

As you can clearly see, the cost of a great editorial product has always been high — that will never change — but the value it creates will outweigh the cost if you get the strategy right.

The big question, of course, is what should the right digital version of this team look like?

This is something I have spent a great deal of time looking at in my current role; here’s a view on what a small, medium, and large business could base a setup on. Obviously this looks different for everyone, as different markets demand different areas of focus, but this can be a start point for discussion:

Small business

In this scenario, we’re looking for multidisciplinary people. In an ideal world, your journalist will be able to both write and PR their work, leaving you with the possibility of also including someone focused on paid promotion across search, social, and native.

As with all of these example team structures, the MD/CEO of the business should own the brand as publisher plan, bringing it to the very centre of focus for the business. In larger businesses, that may ultimately be taken on by the CMO, but in any business of hundreds of people or fewer, this needs to have priority focus.

In a small team the focus has to start with owned and earned media, hence the balance of people here. With a writer and designer you can create lots of different types of content, while the PR person focuses on building key relationships and leveraging those connected audiences.

Medium enterprise

In a slightly larger organization with more budget to play with, things start to get much more interesting as roles become more specialized.

In this model (and read each specialty as being scalable with multiple people in each of those teams) we can create more variation. Video and data start to creep in, allowing you to not only create a wider range of content, but also understand who your audience is, where they are online, and what they consume right now.

Interestingly, we find that those who have traditionally sat in SEO roles make for very good data analysts in helping to forge a data-driven strategy, while their abilities in ensuring platforms are still “fit for purpose” means they can fulfill a dual and extremely valuable role.

We then also have the ability to split out PR and blogger relations. That way, there’s focus on both the niche and the big traffic media brands within the distribution plan.

At this level, it’s also critical to have some specialist paid media focus to ensure that the content distribution plan includes a cohesive paid media element.

Large brand

For large-scale enterprises, the sky is the limit! We can go much further, bringing in additional data specialists and expanding the wider CRM play to include specialists dedicated to making the best use of the whole inbound marketing suite.

We also add in multilingual capability, especially important to international brands, as well as other specialties that give more focus to the overall strategy and ensure it’s scalable.

Help is here!

We’ve covered a great deal of ground in this post on a subject matter that asks wider questions of all brands and businesses. To help you on a more practical level to work through it, we’ve created an all-encompassing Brand as Publisher Toolkit. In it, you’ll find:

  • Flatplan template
  • Magazine cover template
  • Content campaign planner
  • Editorial calendar
  • Persona template
  • A copy of our Content Flow Matrix
  • Content Style Planning Guide

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Beverly’s LTD http://ift.tt/1q3ogTi
via IFTTT

How to Create Content That Earns Engagement, Trust, and Loyalty for Your Brand

Posted by ronell-smith

[Estimated read time: 17 minutes]

A couple of years back, I received a call from the CMO of a small but popular and growing startup about taking on the brand as a content strategist. While I was initially lukewarm to the idea, they were adamant about working together, feeling that I “could help them reach their goals.”

Before hanging up the phone, I asked her to email me the main priority for the onsite content:

“Engaging content (e.g., shares, likes, tweets, etc.),” she wrote.

I thought, I can do engaging.

I reasoned I’d stick with how-to information content, in-depth evergreen content, and maybe a few interviews. In the online marketing vertical, these are what I call “can’t miss elements” for brands looking to create onsite engagement.

But not long after I started working with the brand, I saw some problems that should have been red flags from the beginning:

  • The type of content they wanted for the blog didn’t garner traffic
  • The type of content that did garner traffic didn’t garner engagement
  • When I talked to the CMO, her words were equally confusing: “Conversions are up, but we need to see engagement improve to continue the relationship.”

I was confused.

Is there EVER a scenario where increased conversions are a negative?

Shortly thereafter, the relationship dissolved. The culprit wasn’t a lack of engaging content, though.

Engagement, alone, is a poor choice for a goal

This likely sounds familiar to folks reading this post. Maybe someone says, “We have a shiny new website, so now we need to blog.”

The next question is “Who’s going to blog?”

Then, typically, the question after that is “What do we blog about?”

Someone always, and I do mean always, says, “About what we do. You know… stuff that will get folks talking about our brand.”

The next question and answer dooms us: “What’s the goal?”

  • One blog/week
  • To drive people to our website
  • To increase conversions

Inevitably, the main goal for the content itself, though, is engagement.

The biggest problem brands have in the move to content marketing is creating engaging content.

Why do you think that is?

  • Because it’s hard?
  • Because they don’t have writers who can produce it?
  • Because when they do produce it, folks still don’t engage with it?
  • Because they’re marketing to the wrong audience?

Nope!

Creating engaging content is a nice-to-have, first-step goal. But as the client I talked about earlier found out, engagement alone isn’t going to move your brand forward in what is now a sea of content.

Engagement is a goal; it shouldn’t be the goal.

First, engagement simply means people noticed your content and interacted with it in some, typically small, way. That could mean a social share, leaving a comment, sharing a link, etc. And for those of us just starting on the content marketing journey, that’s nothing to sneeze at.

The problem comes when we use engagement as the all-important key performance indicator (KPI) of how a brand’s content is performing.

I think Avinash Kaushik, Google’s digital marketing evangelist, says about all there is to say about engagement with this quote, taken from his blog:

“Even as creating engaging experiences on the web is mandatory, the metric called Engagement is simply an excuse for an unwillingness to sit down and identify why a site exists. An excuse for an unwillingness to identify real metrics that measure if your web presence is productive. An excuse for taking a short cut…”

He goes further, saying the only people who use engagement as a metric are those too lazy to discern the real reason their website exists.

They refuse to ask “Why does it exist?”

So they assign value to something that is all but impossible to measure in a tangible way.

My experience mirrors those comments. Engagement is an easy, feel-good metric used by brands who lack clear purpose for their content marketing.

My core problem with using engagement as a metric of significance is that it’s hard to measure, next to impossible to sustain, and, worst of all, easy to copy.

In five simple steps, competitors can kill your engagement strategy:

  1. They visit your website and see which content is doing well. The Facebook, Twitter, and Google Plus numbers get them thinking…
  2. They go to Google, do a site:search and see what your top-performing content is. Then they tell their copywriting staff to take this idea and expound upon it — more details, richer graphics, etc.
  3. Then they use a tool such as Open Site Explorer to view your site’s backlinks, seeing who’s linking to you and which content is earning the most links.
  4. They’ll reach out to those same sites and say, “We see you’re linking to this content. We created a similar post that has even more details.” Those sites are likely to add the competitor’s link, and just as likely to remove the link to your content.
  5. Your stellar content piece is likely to take a tumble in the SERPs and your site will miss out on traffic.

All because you chased the wrong goal.
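As a defensive exercise, step 3 above is worth running against your own site before a competitor does. Here’s a minimal Python sketch that ranks pages by how many external pages link to them, assuming a backlink export with hypothetical `target_url`/`source_url` columns (real column names vary by tool):

```python
import csv
import io
from collections import Counter

# Hypothetical backlink export; in practice you'd load the CSV your
# backlink tool produces, and its column names may differ.
raw = """target_url,source_url
/top-guide,http://siteA.com/post
/top-guide,http://siteB.com/resources
/news-item,http://siteC.com/roundup
/top-guide,http://siteC.com/links
"""

# Count how many referring pages point at each of your URLs.
links_per_page = Counter(
    row["target_url"] for row in csv.DictReader(io.StringIO(raw))
)

# Your most-linked content is the most attractive target for copycats.
for page, count in links_per_page.most_common():
    print(page, count)
```

The output ranks `/top-guide` first with three referring pages; those are the relationships you’d want to nurture before anyone else pitches them.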

I’ll add a huge “however” here: If you’re just starting out, or if all you truly care about is creating some potentially engaging content, you can do exactly what we outlined above regarding the competition. You find a popular brand in your vertical and copy the content they’re creating, only you make it better: better written, better designed, and backed by a commitment to outreach. I can tell you that of all the companies I’ve worked with and for — from mom-and-pop cupcake shops to moving companies, fitness brands, apparel manufacturers, and software companies — this is where the content creation process begins and, sadly, sometimes ends. So copy it. Use it. At least until you get better, see better, and know better what the audience wants.

But never hang your hat singularly on engagement.

What comes easily is just as easily taken.

Brand trust is essential for content marketing success

If engagement is a blind date, trust is going steady. It has to be in place before things get too serious.

In the strictest sense, trust is about how prospects and customers view your brand, how they view the people who represent your brand, what you stand for and how you make them feel.


While asking prospects to trust your brand completely is definitely pushing it, brand trust is an imperative in today’s online marketplace.

When trust is in place, people come to see your brand as not simply a reliable option, but the reliable option; they feel good about association with it; and, most importantly, they seek out those interactions.

To get there, people need to see your brand and brand representative in lots of places, online and offline, to develop familiarity and form a positive association with the brand. (I call this positive ubiquity.)

That’s why making too big of a deal about onsite content is a mistake. It’s important. But, let’s be honest, if there are only three people reading your blog, your impact is going to be very limited. Wouldn’t you agree?

In addition to writing posts and sharing your brand’s content, you should also be sharing valuable content from other non-competing brands; engaging in meaningful online conversations surrounding your vertical; interviewing influencers in your space; and creating a presence that moves seamlessly between online and offline, social and content, human to human.

The fact of the matter, though, is that people respond best to people. Not words or images or fancy design. And as reluctant as you might be to have public faces for your brand, you need it to make your content marketing efforts work.

People are what lead prospects to build an affinity, not simply an association, with your brand. It’s akin to going from an encounter to being noticed.

Think…

  • Apple and Steve Jobs
  • Allstate Insurance and the Mayhem man
  • Blendtec and its zany CEO


Make this work for your brand.

Why not highlight the subject matter experts (SMEs) inside the company?

Instead of simply forcing everyone to blog, find out what individual team members are good at and have a passion for, then allow them to express their creativity for the brand in their own way.

  • Maybe one team member is passionate about radio. Why not have her do a podcast for the site, but also share it via iTunes, SoundCloud, or wherever else it makes sense to share it?
  • Every office has the resident know-it-all. Why not create a Twitter handle and associated hashtag for this person, and allow them to spend 30 minutes a day online answering questions for the brand?
  • Maybe you find that someone hates writing blogs, but is interested in theatre and would love doing vlogs for the site as well as posting them on YouTube or Wistia.


And while you’re building that brand affinity, people who aren’t even in the market for your product or service will take note, realizing that your brand cares.

You aren’t out for simply earning a dollar. You’re really helping people, even when those people aren’t likely to buy anything from you.

I know what you’re thinking: “Ronell, who has time or resources for that?”

My answer is, “You don’t have to do any of this. Really, you don’t.”

But I’ll add that if you do at least some of this, consistently, you will be more successful than you likely assume, in large part because most of the competition is unwilling to do it.

Whenever I hear people talking about how difficult it is to find success in content marketing, it reminds me of a quote from one of my favorite strength coaches.

One of his clients said, “Squatting hurts my knees.” After witnessing a demonstration of what the client called a squat, the coach said, “Squats don’t hurt your knees. What you’re doing and calling squats hurts your knees.”

Content marketers are a lot like this, right? We throw ideas at the wall, then call what sticks a success.

We’re better than this.

The path to content marketing success leads to loyalty

Typically, when we set out on this content marketing journey, we, as a team, set these arbitrary goals: We need X number of tweets, X number of Likes and shares on Facebook, Google Plus and so on.

A better way to do it was demonstrated by BuzzFeed.

Yes, that BuzzFeed.

The site might post an inordinate amount of dumb stuff, but it has an amazing data science team. That team studied how content is shared across the web and uncovered some interesting findings.

Their research led to what we now know as P.O.U.N.D.: the Process of Optimizing and Understanding Network Diffusion.

We tend to think that a Facebook Like leads to a Facebook Share, which leads to more Facebook Likes and Shares. And a tweet leads to more tweets, etc., etc., for the other social networks.

What they found is network diffusion doesn’t happen in a linear fashion.

Basically, people jump between social networks and links and back again. For example, a Facebook Like might lead to a Facebook Share that leads to a Twitter Share that bounces to a website via a link then back to Facebook as a Like or Share.

The petri dish-looking graphic from the study is really a depiction of network diffusion, where the dark blue areas are Facebook, the light blue areas are Twitter, and the white areas are links.

What BuzzFeed found is that they get links as a byproduct of network diffusion. They don’t need to optimize for links or make link building a focus. The lesson for them, as it should be for us, is that the more they optimize for network diffusion, the more links they’re going to see.

This is not just fascinating; it’s instructive.

Instead of concerning ourselves with link building and outreach and hoping we get links, if we simply optimize our efforts at creating and sharing content, links naturally occur.

Previously, the thinking was to create a piece of content, then build links to it.

But now, with what we know about network diffusion, we’re going to focus on publishing all of our content to the right streams and to the right audience. We’re optimizing for which social streams move the fastest for the specific topic.

As a content marketer, this information should excite you, especially if your team is ready to commit to the right, and best, goal, which is content loyalty.

If that’s not your goal, scrap your goal and adopt this one.

Content loyalty means you aren’t having to work so hard for your content. Your content is working for you.

  • Folks are avid fans, actively seeking out each and every piece of content you create.
  • Instead of you having to carry the load with sharing and promotion, these fans are sharing and promoting like crazy.
  • Instead of worrying about what content to create, your fans, followers, prospects and customers are actively involved helping you via comments on the blog, questions and responses on social media, interactions with the help desk, and sundry other touch points whereby they interact with the brand.

“The shortest path to break through the noise and create a sustainable content strategy is to create content loyalty,” says Moz’s Matthew J. Brown, who is chief of product strategy and design.

It’s difficult but doable.

Parse.ly, an audience insight platform for digital publishers, found that the median pageview peak for any single piece of content comes just 2.6 days after publication. Pageviews basically fall off a cliff shortly thereafter.

If you get 20% of your traffic from social, things are a little bit better: 3.2 days.

But by and large your window is two to three days.

But the biggest takeaway from their research, which looked at hundreds of sites and billions of pageviews, was that the average site sees only 11% of its visitors returning at least once in a 30-day period.

You heard right: 11%.

That number might sound low, and it is. But it highlights an opportunity.

If you can get that number up to 20%, you’re doing nearly 2X better than the competition.
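If you want to track this number for your own site, the metric is easy to compute from raw visit logs. A minimal sketch, assuming a hypothetical list of (visitor, date) pairs exported from your analytics:

```python
from datetime import date

# Hypothetical visit log: (visitor_id, visit_date) pairs. In practice this
# would come from your analytics export; the data here is illustrative.
visits = [
    ("a", date(2016, 5, 1)), ("a", date(2016, 5, 9)),
    ("b", date(2016, 5, 2)),
    ("c", date(2016, 5, 3)), ("c", date(2016, 5, 20)), ("c", date(2016, 5, 28)),
    ("d", date(2016, 5, 4)),
]

def return_rate(visits, window_days=30):
    """Share of visitors who came back at least once within the window."""
    by_visitor = {}
    for visitor, day in visits:
        by_visitor.setdefault(visitor, []).append(day)
    returned = 0
    for days in by_visitor.values():
        days.sort()
        # A "returning" visitor made a second visit within window_days
        # of their first visit.
        if len(days) > 1 and (days[1] - days[0]).days <= window_days:
            returned += 1
    return returned / len(by_visitor)

print(return_rate(visits))  # 2 of 4 visitors returned -> 0.5
```

Run this over rolling 30-day slices and you have the same loyalty benchmark Parse.ly reports, for your own audience.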

So how do you get there?

A content marketing playbook

Vulture.com conducted a study with Chartbeat to find what on-page content attributes led to content loyalty. They wanted to figure out what led readers to return to their site.

They found that if they could get their readers to return to the first page of their site five times, the readers would become what they term “loyal visitors” of their site, returning frequently to consume information.

In other words, five visits was their core loyalty metric, and the primary starting place for the brand’s content efforts.

They looked at factors ranging from text length to images and the number of ads on the page, and what they found was surprising and illuminating: For them, the key was the amount of text above the fold.

That is, loyal readers expected to consume a certain amount of content above-the-fold. (Click the link above for the details, which are quite interesting.)

Armed with this information, Vulture.com could focus on a targeted attribute that led to their 5X loyal readers.

Nothing is stopping you from doing the same.

Making content loyalty work for your brand

Your first step toward content loyalty is to define your goalpost (e.g., visits per an allotted amount of time), then optimize for the attributes that lead to that goal.

For your brand, it might be content length or number of ads or GIFs or videos.

The key is to dial in those attributes that are specific to your site, then continue to optimize for them.

You likely have some inkling of what topics help earn loyalty in your vertical, based on popularity and such. The same goes for content types: we know that for many industries, blogs, videos, infographics, and the like are the most shared and most linked-to types of content.

Your brand can do the same, provided you have the heart and the patience to do so.

One of the reasons brands are struggling with content marketing is they aren’t giving it enough time. Create a program, set a plan, and let it run.

It’s not a 90-day thing.

“The sheer majority of brands will continue to crash and burn with their content creation and distribution efforts. Simply put, most brands resist telling a truly differentiated story, and even those that do tell one aren’t consistent or patient enough to build loyal audiences over time,” says Content Marketing Institute founder Joe Pulizzi.

If you’re willing to put in the work, though, you can have success.

The natural starting place is a content audit.

I know many of you cringe upon seeing that word. But you have to start somewhere, and the content audit is the best somewhere.

Besides, before you get started producing content, you need to know what you have and how well it’s performing.

If, like me, you’ve done content audits, you know they can be a time-consuming chore, especially when done from scratch.

Luckily, you don’t have to start from scratch.

Using the template found in Mike King’s deck from Authority Rainmaker, you can get an excellent snapshot of the strongest-performing content on your site. Then you simply aggregate that data to see what’s resonating with your readers, what’s creating that network diffusion for your brand.

For example, you can find the most shares for various types of content, which can help you better discern what types of content you should be creating and sharing more of.
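That aggregation step can be as simple as summing shares by content type. A quick sketch, using made-up audit rows in place of your real spreadsheet:

```python
from collections import defaultdict

# Hypothetical audit rows: (url, content_type, total_shares). In practice
# these come from your content-audit spreadsheet or crawl.
audit = [
    ("/guide-a", "guide", 420),
    ("/list-1", "listicle", 130),
    ("/guide-b", "guide", 310),
    ("/video-1", "video", 95),
    ("/list-2", "listicle", 220),
]

# Sum shares per content type.
shares_by_type = defaultdict(int)
for url, content_type, shares in audit:
    shares_by_type[content_type] += shares

# Rank content types by total shares to see what resonates most.
ranking = sorted(shares_by_type.items(), key=lambda kv: kv[1], reverse=True)
for content_type, total in ranking:
    print(content_type, total)
```

In this toy data, guides win by a wide margin, which would argue for creating and sharing more of them.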

Once you have your content audit in hand, the next step to take before executing your new content strategy is to calculate your ROI. The Content Marketing ROI Calculator from Siege Media lets you plug in the costs associated with content creation, along with the links, shares, and loyal visitors it earns, which makes it easier to make the case to your boss or your clients. This is a must-have when you’re trying not only to get buy-in but also to get the time you need to execute your plan.

If your brand is like many of those I’ve worked with in the past, meaning you don’t have a wide base of content from which to pull a great deal of data during the audit, I suggest using a tool like BuzzSumo, a newcomer that has become very popular very fast in content marketing circles.

And for good reason.

It can help you get up and running really fast, and you can learn a great deal about how your content is performing along the way.

BuzzSumo allows you to view the social landscape across myriad topics for the entirety of your competitive landscape.

So, by the time you get started, you can have a complete list of targets and categories to optimize for, even if you don’t have a strong content inventory.

One of the coolest parts about working with Moz — aside from the Roger notepads and pens — is the great people who are always designing and creating tools for us to use, then share with the audience.

For a while now, we’ve been privileged to play with something called One Metric.

Created by our audience and data teams, it allows us to weight social sharing, traffic, links, on-page attention, and reader engagement to create a single content score that ensures we’re looking at the entire picture.
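The actual One Metric formula isn’t spelled out here, but the idea of blending several normalized signals into one weighted score can be sketched like this (the weights and signal names are illustrative, not Moz’s):

```python
# Hypothetical weights for each normalized (0-1) signal. The real One
# Metric weighting is not public; these values are purely illustrative.
weights = {"social": 0.3, "traffic": 0.3, "links": 0.2, "engagement": 0.2}

def content_score(signals, weights):
    """Weighted blend of normalized signals into a single 0-1 score."""
    # Weights should sum to 1 so the score stays on a 0-1 scale.
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[key] * signals[key] for key in weights)

# One article's normalized signals (again, made-up numbers).
article = {"social": 0.8, "traffic": 0.5, "links": 0.2, "engagement": 0.6}
print(round(content_score(article, weights), 2))  # 0.55
```

The point of a blended score like this is that an article strong on shares but weak on links can’t game any single metric; everything contributes, in proportion to what you decide matters.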

Earlier this year, Moz released Moz Content, which is basically One Metric plus 10 and times one million.

With Moz Content, you can crawl your site, then integrate the various bits of information, including content types, your author performance, your social sharing, your links, etc. Even better, you can create, track, and save multiple content audits, making it possible to see how well your content is doing over time, and with ease.

The goal is to make that first step when performing a content audit much easier.

Even better, using the newly created Moz Context API, you’re able to extract the most relevant topics for your site. It can tell you what topics and what keywords are the most relevant for your site and across the web.

This allows you to create a topic inventory for your site.

Let’s say the data shows, based on performance, which content types and topics visitors engage with most on your site. That way you don’t have to guess about what content to create.

You can then focus on optimizing for creating and sharing the right content in the right places for the right audience, instead of blindly creating content with the hope that it performs optimally.

Maybe my favorite feature, and the one that I can see many brands using most to position themselves favorably against the competition, is the Content Search feature. It allows you to see topics — your topics — across the web, enabling you to harness information on what’s getting the most shares, what’s gaining social traction, what’s resonating with your audience.

With this view, you’re getting a bird’s-eye view across the web, so you can see what’s working for the competition, what they’re having success with and what, maybe, you should consider trying.

Full disclosure: Since Moz Content is new, I still rely on BuzzSumo for getting a quick, easy, and clean snapshot at the topical level, then use Moz Content to get a deeper look at the content landscape I’m hoping to track, whether for myself or for a client or prospect. And because both platforms offer a level of free service, I’d suggest using them in tandem, especially at first, to get a feel for which has the features better suited for your needs.

Take your content marketing to the next level

Hopefully, you have a better sense of how to be successful, in addition to having a more in-depth understanding of what it takes to attain long-term success in content marketing. The overall goal for this post, however, was to make it clear that, with regard to the content you create, share and promote, loyalty is THE goal, not a goal.

Remember, content is meant to support your marketing efforts; it should not define them. If the content you create can draw readers to your site consistently, your team can then set about ensuring that the various messaging needed to call attention to or sell additional products are in place, even as you further optimize the content to increase views and viewers.

By making content loyalty your goal, you make it far more likely that the rest of your brand’s goals are attainable.

What are your thoughts? Do you think loyalty is the right goal for your content?

