Which one is the best CMS for SEO?

Posted by soaryourseo

My passion for web development began over two years ago, when I first discovered WordPress, and it has continued to grow ever since. But I always asked myself: is there a better content management system (CMS) for me? I tried Joomla and noticed it works much like WordPress, but I found WordPress more colorful, intriguing, and easier to navigate. Either way, what matters to me is search engine presence, and I found that WordPress is better. Not by a lot, but at the end of the day, it gave me more flexibility for search engine optimization (SEO) and for building a beautiful website. I am proud of my website, Soaring SEO. And no, I am not trying to boast; this is genuinely work from the heart. Just because a website runs on WordPress does not mean it lacks detail or resources. In fact, I find WordPress great for all businesses: I can optimize search engine information to a high degree, and I audit it both internally through plugins inside the CMS and externally using SEO PowerSuite.

I first began by building websites through Wix and Webs and found them weak when it comes to SEO, Wix especially. It is a fine platform for displaying a portfolio to friends and family whenever there is an internet connection, but when it comes to serious traffic generation, it fell through. Look at a Wix website after, say, a few months, and you will see that it overlaps content, erases content, and offers weak SEO options. Webs is just another easy drag-and-drop platform that is passable at best. As a result, I have come to the conclusion that WordPress is the best for businesses and SEO. But why WordPress? And what types of businesses is it good for?

WordPress is a user-friendly CMS, but it is time-consuming. Setting up a basic WordPress website takes approximately one week at 8 hours a day (40 hours) before it looks presentable; by that I mean a very nice, fully functional, up-to-date website that meets my standards. And once it is ready, there is upkeep, blogging, advertising, and the use of many other programs to push your site toward #1 on Google. Which programs? SEO PowerSuite, Google Trends, Google Webmaster Tools, Screaming Frog, plus research into many other ways to get your online campaign noticed. As you can see, SEO is very time-consuming, and many businesses want to do what they do best, which is to offer their discipline or service.

Which businesses are a good fit for WordPress? Just about any, because WordPress is highly customizable. If I had to recommend it, it is good for product selling, services, shops, and small to mid-sized businesses. Larger organizations, such as banks and universities, will have to customize their WordPress websites with additional coding and programming to turn them into large-scale platforms. In sum, WordPress is ideal for SEO and business, while Wix and Webs are fine for putting content online just for friends, family, and personal interest. Why would you hire somebody to do SEO and website development? Because it is very time-consuming and involves a great deal of content creation and writing.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Beverly’s LTD http://ift.tt/1PZxXv7


How to Write for the Web—a New Approach for Increased Engagement – Whiteboard Friday

Posted by Dan-Petrovic

We tend to put a lot of effort into writing great content these days. But what’s the point of all that hard work if hardly anybody actually reads it through to the end?

In this week’s Whiteboard Friday, Dan Petrovic illustrates a new approach to writing for the web to increase reader engagement, and offers some tools and tips to help along the way.

How to Write for the Web - a New Approach for Increased Engagement Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

G’day, Moz fans, Dan Petrovic from DEJAN here. Today we’re talking about how to write for the web.

How much of an article will people actually read?

This year we did an interesting study involving 500 people. We asked them how they read online. We found that the number of people who actually read everything word-for-word is 16%. Amazingly, this is exactly the same percentage that Nielsen came up with in 1997. It's been nearly two decades, and we still haven't learned how to write for the web.

I don't know about you guys, but I find this to be a huge opportunity: something we can do with our blogs and our content to change and improve how we write, in order to provide a better user experience and better performance for our content. Essentially, four out of five people who visit your page will not actually read everything you wrote. The question you have to ask yourself is: why am I even writing if people are not reading?

I went a little bit further with my study, and I asked those same people: Why is it that you don’t read? How is it that there are such low numbers for the people who actually read? The answer was, “Well, I just skip stuff.” “I don’t have time for reading.” “I mainly scan,” or, “I read everything.” That was 80 out of 500 people. The rest said, “I just read the headline and move on,” which was amazing to hear.

Further study showed that people are after quick answers. They don't want to be on a page too long, and they sometimes lose interest halfway through a piece of content. They find bad design to be a deterrent, and the subject matter too complex or poorly written. Sometimes they feel the writing lacks credibility and trustworthiness.

I thought, okay, there’s a bunch of people who don’t like to read a lot, and there’s a bunch of people who do like to read a lot. How do I write for the web to satisfy both ends?

Here was my dilemma. If I write less, the effort required to read my content is very low. That satisfies a lot of people, but it doesn't provide the depth some readers expect, and it doesn't allow me to go into storytelling, which is often very powerful. If I write more, the effort is very high. Some people will be very satisfied, but a lot of people will simply bounce off, even though the longer piece provides depth and enables storytelling.

Actually, I ended up finding out something I didn’t know about, which was how journalists write. This is a very old practice called “inverted pyramid.”

The rules are, you start off with a primary piece of information. You give answers straight up. Right after that you go into the secondary, supporting information that elaborates on any claims made in the first two paragraphs. Right after that we go into the deep content.

I thought about this, and I realized why this was written in such a way: because people used to read printed stuff, newspapers. They would go read the most important thing, and if they drop off at this point, it’s not so bad because they know actually what happened in the first paragraph. The deep content is for those who have time.

But guess what? We write for the web now. So what happens is we have all this technology to change things and to embed things. We don’t really have to wait for our users to go all the way to the bottom to read deep information. I thought, “How can I take this deep information and make it available right here and right there to give those interested extra elaboration on a concept while they’re reading something?”

This is when I decided I’ll dive deeper into the whole thing. Here’s my list. This is what I promised myself to do. I will minimize interruption for my readers. I will give them quick answers straight in the first paragraph. I will support easy scanning of my content. I will support trust by providing citations and references. I will provide in-depth content to those who want to see it. I will enable interactivity, personalization, and contextual relevance to the piece of content people want to retrieve in that particular time.

I took one of my big articles and ran a scroll test on it. This was the cutoff point where people read everything; beyond it, readership drops to 95%, then 85%, then 80%. You keep losing your audience as the article grows in length, and eventually only about 20% of the people who visit your page make it to the bottom of the article.

My first step was to jump on the Hemingway app—a very good online app where you can put in your content and it tells you basically all the unnecessary things you’ve actually put in your words—to actually take them out because they don’t really need to be there. I did that. I sized down my article, but it still wasn’t going to do the trick.

Enter the hypotext!

This is where I came up with the idea of hypotext. I created a little plugin for WordPress that lets readers click on a particular piece of text within my article, kind of like a link.

Instead of going to a new website, which does interrupt their reading experience, a block of text opens within the paragraph of text they’re reading and gives them that information. They can click if they like, or if they don’t want to look up this information, they don’t have to. It’s kind of like links, but injected right in the context of what they’re currently reading.
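As an illustration only (this is not the actual DEJAN plugin, and every name and string below is invented for the example), the core idea can be sketched as a compressed article plus a dictionary of inline elaborations: "clicking" a phrase splices its elaboration into the text instead of navigating the reader away.

```python
# Hypothetical sketch of hypotext: store a short core article plus optional
# elaborations keyed by phrase, and splice an elaboration inline on demand
# rather than linking the reader off the page.

core = "We studied 500 readers. Only 16% read word-for-word."

expansions = {
    "500 readers": "500 readers, surveyed online about how they consume long-form content,",
}

def expand(text, phrase, expansions):
    """Return the text with `phrase` replaced by its inline elaboration, if any."""
    return text.replace(phrase, expansions.get(phrase, phrase))

print(expand(core, "500 readers", expansions))
```

The interesting design property is that the short version is the default: readers who never click still get a complete, self-contained article.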

This was a nerve-wracking exercise for me. I did 500 revisions of this article until I got it right. What used to be a 5,000-word article turned into a 400-word article, which can then be expanded to its original 5,000-word form. People said, “That’s great. You have a nice hypothesis, nice theory, but does this really work?”

So I decided to put everything I did to a test. An old article, which takes about 29 minutes to read, was attracting people to the page, but they were spending 6 minutes on average—which is great, but not enough. I wanted people to spend way more time. If I put the effort into writing, I wanted them to digest that content properly. The bounce rate was quite high, meaning they were quite tired with my content, and they just wanted to move on and not explore anything else on my website.

Test Results

After implementing the compressed version of my original article, giving them a choice of what they will read and when, I expanded the average time on page to 12 minutes, which is extraordinary. My bounce rate was reduced to 60%, which meant that people kept browsing for more of my content.

We did a test with a content page, and the results were like this:

Basically, the engagement metrics on the new page were significantly higher than on the old when implemented in this way.

On a commercial landing page, we had a situation like this:

We only had a small increase in engagement. It was about 6%. Still very happy with the results. But what really, really surprised me was on my commercial landing page—where I want people to actually convert and submit an inquiry—the difference was huge.

It was about a 120% increase in the inquiries in comparison to the control group when I implemented this type of information. I removed the clutter and I enabled people to focus on making the inquiry.

I want you all to think about how you write for the web, what is a good web reading experience, and how content on the web should be, because I think it’s time to align how we write and how we read on the web. Thank you.

Video transcription by Speechpad.com

A few notes:

First, for an example of an implementation of hypotext, take a look at this post on user behavior data.

Next, keep in mind that Google devalues hidden content, as they disagree with its usability. You can read more about this on the DEJAN blog, where there are further tips on the dangers of hidden content and how you can combat them.

One solution is to reverse how hypotext works in an article. Rather than defaulting to the shorter piece, you can start by showing the full text and offer a “5-minute-read” link (example here) for those inclined to skim or not interested in the deep content.

Share your thoughts in the comments below, and thanks for listening!


from Beverly’s LTD http://ift.tt/1ioNUxt

Click-Through Rate Isn’t Everything: 8 Ways to Improve Your Online Display Ads

Posted by rMaynes1

You are exposed to an average of 362 online display ads a day. How close are you to buying anything when you see those ads?

Online display ads have been around for over 20 years. They’re nothing new. But over the past 2 decades, the content, format, and messaging of display ads have changed dramatically—because they have had to!

The click-through rate of that first banner ad in 1994 was 44%. CTRs have steadily declined, and were sitting at around 0.1% in 2012 for standard display ads (video and rich media excluded), according to DoubleClick. Advertisers had to do something to ensure that their ads were seen, and engaged with—ads had to be a useful resource, and not an annoying nuisance.

It’s important, however, that the focus is not firmly fixed on CTRs. Yes, online display ads have largely been considered a tool for direct response advertising, but more recently, advertisers are understanding the importance of reaching the right person, in the right mindset, with an ad that can be seen. This ad may not be clicked on, but does that mean it wasn’t noticed and remembered? Advertisers are increasingly opting to pay for performance as opposed to clicks and/or impressions. Advertisers want their ad to drive action that leads to purchase—and that isn’t always in the form of a click.

Mediative recently conducted and released a research study that looks at how display ads can drive purchase behaviour. If someone is browsing the web and sees an ad, can it influence a purchase decision? Are searchers more responsive to display ads at different stages in the buying cycle? What actions do people take after seeing an ad that captures their interest? Ultimately, Mediative wanted to know how indicative of purchase behaviour a click on an ad was, and if clicks on display ads even matter anymore when it comes to driving purchase behaviour and measuring campaign success. The results from an online survey are quite interesting.

1. The ability of online display ads to influence people increases as they come closer to a purchase decision.

In fact, display ads are 39% more likely to influence web users when they are researching a potential purchase versus when they have no intent to buy.

Advertiser action item #1:

Have different ad creatives, with different messaging, that appeal separately to researchers and to purchasers of your product or service. Combined with targeted impressions, this makes advertisers more likely to reach and engage their target audience when it is most receptive to the particular messaging in the ad.

Here are a few examples of Dell display ads and different creatives that have been used:

This creative is focusing on particular features of the product that might appeal more to researchers.

This ad injects the notion of “limited time” to get a deal, which might cause people who are on the fence to act faster—but it doesn’t mention pricing or discounts.

These creatives introduce price discounts and special offers which will appeal to those in the market to buy.

2. The relevancy of ads cannot be overstated.

40% of people took an action (clicked the ad, contacted the advertiser, searched online for more information, etc.) from seeing an ad because it was relevant to a need or want, or relevant to something they were doing at the time.

Advertiser action item #2:

Use audience data or lookalike modeling in display campaigns to ensure ads will be targeted to searchers who have a higher likelihood of being interested in the product or service. Retargeting ads to people based on their past activity or searches is valuable at this stage, as potential customers can be reached all over the web while they comparison shop.

An established Canadian charitable organization ran an awareness campaign in Q2 2015 using retargeting, first- and third-party data lookalike modeling, and contextual targeting to help drive existing and new users to its website. The goal was to drive donations while reducing the campaign's effective cost per action (eCPA). This combination added granularity to the targeting, enabling the most efficient spending possible. The result was an eCPA of $76 versus the goal of $600; the goal was 689% higher than the cost actually achieved.
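To make the reported figures concrete: eCPA is simply spend divided by actions, and the 689% figure relates the $600 goal to the $76 achieved. A quick check of the arithmetic, using only the numbers from the case study above:

```python
# eCPA = total spend / total actions. The two figures below are the reported
# goal and the achieved cost from the case study.
goal_ecpa = 600.0
actual_ecpa = 76.0

# How much higher the goal was than the achieved cost, in percent.
pct_goal_above_actual = (goal_ecpa - actual_ecpa) / actual_ecpa * 100
print(round(pct_goal_above_actual))  # 689
```

Note that a cost cannot literally "decrease by 689%"; the percentage only makes sense as the gap between goal and result, measured against the result.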

3. Clicks on ads are not the only actions taken after seeing ads.

53% of people said they were likely to search online for the product featured in the ad (the same as those who said they would click on the ad). Searching for more information online is just as likely as clicking the ad after it captures attention, just not as quickly as a click (74% would click on the ad immediately or within an hour, 52% would search online immediately or within an hour).

Advertiser action item #3:

It is critical not to measure the success of a display campaign by clicks alone. Advertisers can get caught up in CTRs, but it’s important to remember that ads will drive other behaviours in people, not just a click. Website visits, search metrics, etc. must all be taken into consideration.

A leading manufacturer of PCs, laptops, tablets, and accessories wanted to increase sales in Q2 of 2014, with full transparency on the performance and delivery of the campaign. The campaign was run against specific custom audience data focusing on people of technological, educational, and business interest, and was optimized using various tactics. The result? The campaign achieved a post-view ROI revenue (revenue from target audiences who were presented with ad impressions, yet did not necessarily click through at that time) that was 30x the amount of post-click revenue.

4. Clicks on ads are not the only actions that lead to purchase.

33% of respondents reported making a purchase as a direct result of seeing an ad online. Of those, 61% clicked and 44% searched (multiple selections were allowed), which led to a purchase.

Advertiser action item #4:

Revise the metrics you measure. Measuring “post-view conversions” will take into account the fact that people may see an ad, but act later—the ad triggers an action, whether it be a search, a visit, or a purchase—but not immediately, and it is not directly measurable.
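A hedged sketch of what counting "post-view conversions" might look like: credit a conversion to an ad when a user saw the ad, without clicking, within some attribution window before converting. The user names, timestamps, and the 24-hour window below are invented for illustration; real ad platforms define their own windows and attribution rules.

```python
# Illustrative post-view conversion count: a conversion counts if the user
# was served an impression within the attribution window before converting.
from datetime import datetime, timedelta

impressions = {  # user -> time the ad was viewed (not clicked)
    "alice": datetime(2015, 9, 1, 10, 0),
    "bob": datetime(2015, 9, 1, 11, 0),
}
conversions = {  # user -> time of purchase/inquiry
    "alice": datetime(2015, 9, 1, 18, 30),  # 8.5 hours later: counts
    "bob": datetime(2015, 9, 5, 9, 0),      # 4 days later: outside the window
}

def post_view_conversions(impressions, conversions, window=timedelta(hours=24)):
    """Count conversions that occurred within `window` after an impression."""
    return sum(
        1 for user, seen in impressions.items()
        if user in conversions and timedelta(0) <= conversions[user] - seen <= window
    )

print(post_view_conversions(impressions, conversions))  # 1
```

The point of the metric is exactly what the text describes: the ad triggered the action, even though no click ever appears in the click logs.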

5. The age of the target audience can impact when ads are most likely to influence them in the buying cycle.

  • Overall, 18–25 year olds are most likely to be influenced by online advertising.
  • At the beginning of the buying cycle, younger adults aged 18–34 are likely to notice and be influenced by ads much more than people aged over 35.
  • At the later stages of the buying cycle, older adults aged 26–54 are 12% more likely than 18–25 year olds to have made a purchase as a result of seeing an ad.

Advertiser action item #5:

If your target audience is older, multiple exposures of an ad might be necessary in order to increase the likelihood of capturing their attention. Integrated campaigns could be more effective, where offline campaigns run in parallel with online campaigns to maximize message exposure.

6. Gender influences how much of an impact display ads have.

More women took an online action that led to a purchase in the last 30 days, whereas more men took an offline action that led to a purchase.

  • 76% more women than men visited an advertiser’s website without clicking on the ad.
  • 47% more women than men searched online for more information about the advertiser, product, or service.
  • 43% more men than women visited the advertiser’s location.
  • 33% more men than women contacted the advertiser.

Advertiser action item #6:

Ensure you know as much about your target audience as possible. What is their age, their average income? What sites do they like to visit? What are their interests? The more you know about who you are trying to reach, the more likely you will be to reach them at the right times when they will be most responsive to your advertising messages.

7. Income influences how much of an impact display ads have.

  • Web users who earned over $100K a year were 35% more likely to be influenced by an ad when exposed to something they hadn’t even thought about than those making under $50K a year.
  • When ready to buy, people who earned under $20K were 12.5% more likely to be influenced by ads than those making over $100K.

Advertiser action item #7:

Lower earners (students, part-time workers, etc.) are more influenced by ads when ready to buy, so will likely engage more with ads offering discounts. Consider income differences when you are trying to reach people at different stages in the buying cycle.

8. Discounts don’t influence people if they are not relevant.

We were surprised that the survey results indicated discounts or promotions in ads did not have more of an impact on people, but it's likely that the ads with coupons were irrelevant to the searcher's needs or wants, and therefore had no impact. We asked people what their reasons were for taking action after seeing an online ad. 40% of respondents took an action for a more purchase-related reason than simple interest: they acted because the ad was relevant to a need or want, or relevant to something they were doing at the time.

Advertiser action item #8:

Use discounts strategically. Utilizing data in campaigns can ensure ads reach people with a high intent to buy and a high likelihood of being interested in your product or service. Turn interest into desire with coupons and/or discounts—it will have more of an impact if directly tied to something the searcher is already considering.

In conclusion, to be successful, advertisers need to ensure their ads provide value to web users. To be noticed, remembered, and engaged with, relevancy of the ad is key: serving relevant ads related to a searcher's current need or want is far more likely to capture attention than a "one-size-fits-all" approach.

Advertisers will be rewarded for their attention to personalization with more interaction with ads and a higher likelihood of a purchase. Analyzing lower funnel metrics, such as post-view conversions, rather than simply concentrating on the CTR will allow advertisers to have a far better understanding of how their ads are performing, and the potential number of consumers that have been influenced.

Rebecca Maynes, Manager of Content Marketing and Research with Mediative, was the major contributor on this whitepaper. The full research study is available for free download at Mediative.com.


from Beverly’s LTD http://ift.tt/1Wk2ZgZ

Why All SEOs Should Unblock JavaScript & CSS… And Why Google Cares

Posted by jenstar

If you’re a webmaster, you probably received one of those infamous “Googlebot cannot access CSS and JS files on example.com” warning letters that Google sent out to seemingly every SEO and webmaster. This was a brand new alert from Google, although we have been hearing from the search engine about the need to ensure all resources are unblocked—including both JavaScript and CSS.

There was definite confusion around these letters, supported by some of the reporting in Google Search Console. Here’s what you need to know about Google’s desire to see these resources unblocked and how you can easily unblock them to take advantage of the associated ranking boosts.

Why does Google care?

One of the biggest complaints about the warning emails lay in the fact that many felt there was no reason for Google to see these files. This was especially true because it was flagging files that, traditionally, webmasters blocked—such as files within the WordPress admin area and WordPress plugin folders.

Here’s the letter in question that many received from Google. It definitely raised plenty of questions and concerns:

Of course, whenever Google does anything that could devalue rankings, the SEO industry tends to freak out. And the confusing message in the warning didn’t help the situation.

Why Google needs it

Google needs to render these files for a couple of key reasons. The most visible and well known is the mobile-friendly algorithm. Google needs to be able to render the page completely, including the JavaScript and CSS, to ensure that the page is mobile-friendly and to apply both the mobile-friendly tag in the search results and the associated ranking boost for mobile search results. Unblocking these resources was one of the things that Google was publicly recommending to webmasters to get the mobile-friendly boost for those pages.

However, there are other parts of the algorithm that rely on using it, as well. The page layout algorithm, the algorithm that looks at where content is placed on the page in relation to the advertisements, is one such example. If Google determines a webpage is mostly ads above the fold, with the actual content below the fold, it can devalue the rankings for those pages. But with the wizardry of CSS, webmasters can easily make it appear that the content is front and center, while the ads are the most visible part of the page above the fold.

And while it's an old-school trick and not very effective, people still use CSS and JavaScript to hide things like keyword stuffing and links; in the case of a hacked site, these tricks can even hide the hack from the actual website owner. By crawling the CSS and JavaScript, Googlebot can determine whether they are being used spammily.

Google also has hundreds of other signals in their search algo, and it is very likely that a few of those use data garnered from CSS and JavaScript in some fashion as well. And as Google changes things, there is always the possibility that Google will use it for future signals, as well.

Why now?

While many SEOs had their first introduction to the perils of blocking JavaScript and CSS when they received the email from Google, Matt Cutts was actually talking about it three-and-a-half years ago in a Google Webmaster Help video.

Then, last year, Google made a significant change to their webmaster guidelines by adding it to their technical guidelines:

Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.

It still got very little attention at the time, especially since most people believed they weren’t blocking anything.

However, one major issue was that some popular SEO WordPress plugins were blocking some JavaScript and CSS. Since most WordPress users weren’t aware this was happening, it came as a surprise to learn that they were, in fact, blocking resources.

It also began showing up in a new “Blocked Resources” section of Google Search Console in the month preceding the mobile-friendly algo launch.

How many sites were affected?

In usual Google fashion, they didn't give specific numbers about how many webmasters received these blocked-resources warnings. But Gary Illyes from Google did confirm that they went out to 18.7% of the sites that received the mobile-friendly warnings earlier this year:

@jenstar about 18.7% of that sent for mobile issues a few months back

— Gary Illyes (@methode) July 29, 2015

Finding blocked resources

The email that Google sent to webmasters alerting them to the issue of blocked CSS and JavaScript was confusing. It left many webmasters unsure of what exactly was being blocked and what was blocking it, particularly because they were receiving warnings for JavaScript and CSS hosted on other third-party sites.

If you received one of the warning letters, the suggestion for how to find blocked resources was to use the Fetch tool in Google Search Console. While this might be fine for checking the homepage, for sites with more than a handful of pages, this can get tedious quite quickly. Luckily, there’s an easier way than Google’s suggested method.

There's a full walkthrough here, but for those familiar with Google Search Console, you'll find a section called "Blocked Resources" under "Google Index" that tells you which JavaScript and CSS files are blocked and which pages they're found on.
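If you'd rather sanity-check a robots.txt programmatically than test pages one by one in the Fetch tool, specific resource URLs can be tested against it with Python's standard-library parser. The robots.txt content below mirrors the common WordPress-style block discussed later in this post; the example.com URLs are placeholders.

```python
# Check whether Googlebot may fetch given CSS/JS URLs under a robots.txt,
# using the stdlib robots.txt parser.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/admin.css"))    # False
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/style.css"))  # True
```

Running each of your templates' script and stylesheet URLs through a check like this catches blocked resources across the whole site, not just the homepage.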

You also should make sure that you check for blocked resources after any major redesign or when launching a new site, as it isn’t entirely clear if Google is still actively sending out these emails to alert webmasters of the problem.


There’s been some concern about those who use specialized scripts on internal pages and don’t necessarily want to unblock them for security reasons. John Mueller from Google said that they are looking primarily at the homepage—both desktop and mobile—to see what JavaScript and CSS are blocked.

So at least for now, while it is certainly a best practice to unblock CSS and JavaScript from all pages, at the very least you want to make it a priority for the homepage, ensuring nothing on that page is blocked. After that, you can work your way through other pages, paying special attention to pages that have unique JavaScript or CSS.

Indexing of JavaScript & CSS

Another reason many sites give for not wanting to unblock their CSS and JavaScript is that they don't want those files indexed by Google. But neither is a file type that Google will index, according to its long list of supported file types for indexation.

All variations

It is also worth remembering to check both the www and the non-www versions of your site for blocked resources in Google Search Console. This is often overlooked by webmasters who only tend to look at the version they prefer to use for the site.

Also, because the blocked-resources data shown in Search Console is based on when Googlebot last crawled each page, you could find additional blocked resources when checking both. This is especially true for sites that are older or updated less frequently, and so are not crawled daily (as a more popular site is).

Likewise, if you have both a mobile version and a desktop version, you’ll want to ensure that both are not blocking any resources. It’s especially important for the mobile version, since it impacts whether each page gets the mobile-friendly tag and ranking boost in the mobile search results.

And if you serve different pages based on language and location, you’ll want to check each of those as well. Don’t just check the “main” version and assume it’s all good across the entire site. It’s not uncommon to discover surprises in other variations of the same site. At the very least, check the homepage for each language and location.

WordPress and blocking JavaScript & CSS

If you use one of the "SEO for WordPress"-type plugins on a WordPress-based site, chances are you're blocking JavaScript and CSS because of that plugin. Blocking everything in the /wp-admin/ folder used to be one of the out-of-the-box default settings for some of them.

When the mobile-friendly algo came into play, because those admin pages were not being individually indexed, the majority of WordPress users left that robots.txt block intact. But this new Google warning requires that all WordPress-related JavaScript and CSS be unblocked, and Google will show an error if you block them.

Yoast, creator of the popular Yoast SEO plugin (formerly WordPress SEO), also recommends unblocking all the JavaScript and CSS in WordPress, including the /wp-admin/ folder.

Third-party resources

One of the ironies of this was that Google was flagging third-party JavaScript, meaning JavaScript hosted on a third-party site that was called from each webpage. And yes, this includes Google’s own Google AdSense JavaScript.

Initially, Google suggested that website owners contact those third-party sites and ask them to unblock the JavaScript being used, so that Googlebot could crawl it. However, not many webmasters did this; they felt it wasn’t their job, especially when they had no control over what a third-party site blocks from crawling.

Google later said that they were not concerned about third-party resources because of that lack of control webmasters have. So while it might come up on the blocked resources list, they are truly looking for URLs for both JavaScript and CSS that the website owner can control through their own robots.txt.

John Mueller revealed more recently that they were planning to reach out to some of the more frequently cited third-party sites in order to see if they could unblock the JavaScript. While we don’t know which sites they intend to contact, it was something they planned to do; I suspect they’ll successfully see some of them unblocked. Again, while this isn’t so much a webmaster problem, it’ll be nice to have some of those sites no longer flagged in the reports.

How to unblock your JavaScript and CSS

For most users, it’s just a case of checking the robots.txt and ensuring you’re allowing all JavaScript and CSS files to be crawled. For Yoast SEO users, you can edit your robots.txt file directly in the admin area of WordPress.

Gary Illyes from Google also shared some detailed robots.txt changes on Stack Overflow. You can add these directives to your robots.txt file in order to allow Googlebot to crawl all JavaScript and CSS.

To be doubly sure you’re unblocking all JavaScript and CSS, you can add the following to your robots.txt file, provided you don’t have any directories being blocked in it already:

User-Agent: Googlebot
Allow: .js
Allow: .css

If you have a more specialized robots.txt file, where you’re blocking entire directories, it can be a bit more complicated.

In these cases, you also need to allow the .js and .css for each of the directories you have blocked.

For example:

User-Agent: Googlebot
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css

Repeat this for each directory you are blocking in robots.txt.
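Google's precedence rules (most specific matching rule wins, with wildcard support) can be hard to evaluate by eye once several directories are involved. As a rough sanity check, here is a small Python sketch that models Google's documented matching behavior; it is an illustration only, not a replacement for the robots.txt tester in Search Console.

```python
import re

def rule_to_regex(path):
    # Convert a robots.txt path pattern to a regex: '*' matches any
    # sequence of characters; a trailing '$' anchors the end of the URL.
    anchored = path.endswith("$")
    if anchored:
        path = path[:-1]
    pattern = ".*".join(re.escape(part) for part in path.split("*"))
    return re.compile("^" + pattern + ("$" if anchored else ""))

def is_allowed(rules, url_path):
    # rules: list of (directive, path) tuples for one user-agent group.
    # Per Google's documented semantics: the longest matching rule wins;
    # on a tie, Allow beats Disallow. No matching rule means allowed.
    best_len, best_allow = -1, True
    for directive, path in rules:
        if not path:
            continue
        if rule_to_regex(path).match(url_path):
            if len(path) > best_len:
                best_len = len(path)
                best_allow = (directive == "allow")
            elif len(path) == best_len and directive == "allow":
                best_allow = True
    return best_allow

# The directory example from above:
rules = [("disallow", "/deep/"),
         ("allow", "/deep/*.js"),
         ("allow", "/deep/*.css")]

print(is_allowed(rules, "/deep/page.html"))     # False - still blocked
print(is_allowed(rules, "/deep/js/app.js"))     # True - crawlable
print(is_allowed(rules, "/deep/css/site.css"))  # True - crawlable
```

Running it against the example rules confirms that the HTML in /deep/ stays disallowed while the JavaScript and CSS beneath it become crawlable, because the longer Allow patterns win over the shorter Disallow.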

This allows Googlebot to crawl those files while disallowing other crawlers (if you’ve blocked them). However, chances are good that the kinds of bots you’re most concerned about keeping out of your JavaScript and CSS files aren’t the ones that honor robots.txt anyway.

You can change the User-Agent to *, which would allow all crawlers to crawl it. Bing does have its own version of the mobile-friendly algo, which requires crawling of JavaScript and CSS, although they haven’t sent out warnings about it.

Bottom line

If you want to rank as well as you possibly can, unblocking JavaScript and CSS is one of the easiest SEO changes you can make to your site. This is especially important for those with a significant amount of mobile traffic, since the mobile ranking algorithm does require they both be unblocked to get that mobile-friendly ranking boost.

Yes, you can continue blocking Googlebot from crawling either of them, but your rankings will suffer if you do so. And in a world where every position gained counts, it doesn’t make sense to sacrifice rankings in order to keep those files private.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Beverly’s LTD http://ift.tt/1H9SBS4

Are Your Analytics Telling the Right Story?

Posted by Bill.Sebald

A process can easily become a habit. A habit may not change without awareness or intervention.

Before it becomes a habit, a process should be adjusted to change along with new goals, constant learning, experimentation, and so on.

Considering your time in analytics, are you engaging in a process, or in an outdated habit?

That’s a real question that digital marketing practitioners should ask themselves. Inherently, marketers tend to be buried with work, reusing templates to speed up results. But many agencies lean on those templates a little too much, in my opinion.

Templates should never be written in stone.

If your company is pumping out canned reports, you’re not alone. I do the business development for our company and regularly ask prospects to explain or share the reports they’ve received in the past. What they share is sometimes truly discouraging: outdated, wasteful, and often the very reason businesses search for new SEO vendors.

Look—I’m all for scalability. It’s a huge help. But some things can’t be scaled and still be successful, especially in today’s SEO climate—or, frankly, marketing in general. Much of what was scalable in SEO prior to 2011 is now penalty-bait. Today’s analytics tools and platforms can slice and dice data faster than anything Ron Popeil ever sold, but the human element will always be necessary if you want your marketing to dominate.

Find the stories to tell

I like to tell stories. I’m real fun in the pub. What I’ve always loved about marketing is the challenge to not only find a story, but have that story change something for the better. I like adding my layer based on real data and experimenting.

Analytics work is all about finding the story. It’s detective work. It’s equal parts Sherlock Holmes, Batman, and Indiana Jones. If you’re lucky, the story jumps out with very little digging. However, it’s more likely you’ll be going on some expeditions. It’s common to start with a hunch or a random click through the reports, but you need to always be looking for the story.

A great place to start is through client conversations. We schedule at least one monthly call with our clients, where it’s truly a discussion session. We get conversations going to pull intel out of the key stakeholders. Case in point: Recently, we discovered through an open discussion that one of our clients had great success with an earlier email campaign targeted to business owners. There was specific information customers positively responded to, which was helpful in recent content development on their website. It’s amazing what you can learn by asking questions and simply listening to responses.

We should be true consultants, not report monkeys. Dive into the discussions started and enjoy the ride. I guarantee you’ll take note of a few ripe areas to review next time you log into your Google Analytics account.

An impromptu survey says it’s a time issue

Most SEO engagements are designed around a block of purchased hours. Hopefully the client understands they’re not only buying your time to complete SEO tasks, but also your expertise and analysis. If someone on your team were to say, “I don’t have time to do analysis because all my tasks used up their budget this month,” then you really need to question the value of the chosen tasks. Were they picked based on front-loaded analysis, or were they simply tasks pulled out of guesswork?

A few weeks ago I pushed a quick Survey Monkey survey out on Twitter and LinkedIn. Thanks to a few retweets, 94 people responded (please consider the following results more directional than scientific—I’m well aware it’s a shallow survey pool). I asked two questions:

  1. If you work in-house or have clients, how often do you log into your clients’ analytics? (Multiple choices ranged from several times a day to a few times a month).
  2. Do you, or do you not, get enough time in Analytics to interpret the data?

The responses:


While some do make a habit of logging into analytics one or more times a day, more do not. Is it required to check under the hood every day? Personally, I believe it is—but your answer may vary on that one. If something went south overnight, I want to be aware before my client tells me. After all, that’s one of the things I’m paid for. I like the idea of being proactive—not reactive.

More notable is that most respondents didn’t feel they get enough time in analytics. That should absolutely change.

There was also a field for respondents to elaborate on their selections. There were several comments that jumped out at me:

“In house, day to day tasks and random projects prevent me from taking the deep dives in analytics that I feel are valuable.”

“It’s challenging to keep up with the changes and enhancements made in Google Analytics in particular, amongst other responsibilities and initiatives.”

“Too many things are on my plate for me to spend the time I know I should be spending in Google Analytics.”

“Finding the actionable info in Analytics always takes more time than expected—never enough time to crunch the numbers!”

“I log in to ‘spot check’ things but rarely do I get to delve into the data for long enough to suss out the issues and opportunities presented by the data.”

These results suggest that many marketers are not spending enough time with analytics. And possibly not because they don’t see the value, but simply because they don’t have time. “Either you run the day, or the day runs you (Jim Rohn)” is apropos here—you must make time. You need to get on top of all the people filling your plate. It’s not easy, but it needs to be done.

Get on top of those filling your plate. Kind of like professional crowd surfing.

Helpful resources

Dashboards are fantastic, but I rarely see them set up in analytics platforms. They’re one of the best ways to get a quick glimpse of your key metrics, and all good analytics platforms provide the ability to make custom dashboards. Get into work, grab a coffee, fire up the computer, click your dashboard bookmark. (I recommend that order!) Google Analytics, which most of us probably use, provides some decent options with their dashboards, though limited compared to enterprise analytics platforms.

However, this basic dashboard is the minimum you should review in analytics. We’ll get deeper soon.

Building these widgets is quite easy (I recently created a tutorial on my site). There are also websites that provide dashboards you can import into Google Analytics. Dashboard Junkie is a fun one. Here are some others from Econsultancy and Google themselves.

It’s not just analytics platforms that offer dashboards. There are several other vendors in the SEO space that port in analytics data and mesh with their own data—from Moz Analytics to SearchMetrics to Conductor to many, many others.

SEMrush has a unique data set that marketers should routinely review. While your traffic data in analytics will be truer, if you’re targeting pages you may be interested in monitoring keyword rank counts:

Are backlinks a target? Maybe you’d find Cognitive SEO’s dashboard valuable:


RankRanger is another SaaS we use. It’s become way more than just our daily rank tracking software. The data you can port in creates excellent snapshots and graphs, and strong dashboards:


It also offers other graphing functionality to make pretty useful views:

While some of the bigger platforms, like SearchMetrics and Conductor, make it easier to get a lot of information within one login, I’m still finding myself logging into several programs to get the most useful data possible. C’est la vie.

Analytics is your vehicle to identifying problems and opportunity

Remember, dashboards are simply the “quick and dirty” window into your site. They help spotlight drastic changes, and make your website’s general traction more visible. Certainly valuable for when your CMO corners you by the Keurig machine. It’s a state of the union, but doesn’t focus on subsections that may need attention.

Agencies and consultants tend to create SEO reports for their clients as a standard practice, though sometimes these reports become extremely boilerplate. Boilerplate reports essentially force you to look under the same rocks month after month. How can you get a bigger view of the world if you never leave your comfortable neighborhood? A new routine needs to be created by generating new reports and correlations, finding trends that were hidden, and using all the tools at your disposal (from Analytics to link tools to competitive tools).

Your analytics app is not a toy—it’s the lifeblood of your website.

Deeper dives with Google Analytics

Grouped pages lookup

A quick way to look at chunks of the site is by identifying a footprint in the URL and searching with that. For example, go to Behavior > Site Content > All Pages or Landing Pages. Then, in the search bar right below the graph, search for the footprint. Say http://ift.tt/1S7JyaK is a real URL. If you want to see everything in the blog, enter */blog/ into the search bar. This is especially useful in getting the temperature of an eCommerce category.
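If you export that page report, the same footprint filter is easy to replicate offline. Here is a small Python sketch; the (page_path, pageviews) rows are invented examples, not real data.

```python
from fnmatch import fnmatch

# Replicate GA's wildcard search-bar filter on an exported page report.
# These rows are invented (page_path, pageviews) pairs for illustration.
rows = [
    ("/blog/seo-tips", 1200),
    ("/blog/link-building", 800),
    ("/products/widgets", 3100),
    ("/about", 450),
]

footprint = "*/blog/*"  # the same footprint you'd type into GA's search bar
blog_rows = [(path, views) for path, views in rows if fnmatch(path, footprint)]

print(blog_rows)                             # just the /blog/ pages
print(sum(views for _, views in blog_rows))  # combined pageviews: 2000
```

The same approach works for any URL footprint, such as an eCommerce category path.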

Segment sessions with conversions/transactions

So often in SEO we spend our time analyzing what’s not working or posing as a barrier. This report helps us take a look at what is performing (by leads or sales generated) and the customer behavior, channels, and demographic information that goes along with that. Then we can identify opportunities to make use of our success and improve our overall inbound strategy.

Below is a deeper dive into the conversions “Lead Generation” segment, although these same reports can just as aptly be applied to transactions. Ultimately, there are a lot of ways to slice and dice the analysis, so you’ll have to know what makes sense for your client, but here are three different reports from this segment that provided useful insights that will enhance our strategy.

  • Conversions
    One of the easy and most valuable ones! Directions: Under any report, go to Add a Segment > Sessions with Conversions > Apply.
  • Demographics – age, gender, location
    For example, our client is based in Pennsylvania, but is receiving almost as many request form submissions from Texas and New York, and has a high ratio of request form submissions to visitors for both of these other states. Given our client’s industry, this gives us ideas on how to market to these individuals and additional information the Texans may need given the long distance.
  • Mobile – overview, device type, landing pages
    For this client, we see more confirmation of what has been called the “micro-moment” in that our mobile users spend less time on the site, view fewer pages per visit, have a higher bounce rate, and are more likely to be new users (less brand affinity). This would indicate that the site is mobile optimized and performing as expected. From here, I would next go into mobile traffic segments to find pages that aren’t receiving a lot of mobile traffic, but are similar to those that are, and find ways to drive traffic to those pages as well.
  • Acquisition
    Here we’re looking at how the inbound channels stack up for driving conversions. Organic and Paid channels are neck and neck, although referral and social are unexpected wins (and social, glad we’ve proven your viability to make money!). We’ll now dig deeper into the referring sites and social channels to see where the opportunities are here.

Assisted conversions

There’s more to the story than last click. In Analytics, go to Conversions > Multi-Channel Funnels > Assisted conversions. Many clients have difficulty understanding the concept of attribution. This report seems to provide the best introduction to the world of attribution. Last click isn’t going to be replaced anytime soon, but we can start to educate and optimize for other parts of the funnel.

True stories from analytics detective work

Granted, this is not a post about favorite reports. But this is a post about why digging through analytics can open up huge opportunities. So, it’s real-life example time from Greenlane’s own experience!

Story 1: The Forgotten Links

The client is a big fashion brand. They’ve been a popular brick-and-mortar retail destination since the early 80s, but only went online in 1996. This is the type of company that builds links based on their brand ambassadors and trendy styles. SEO wasn’t the mainstream channel it is today, so it’s likely they had some serious architecture changes since the 90s, right?

For this company, analytics data can only be traced back about seven years. We thought, “Let’s take a look at what drove traffic in their early years. Let’s see if there were any trends that drove volume and sales where they may be slipping today. If they had authority then, and are slipping now, it might be easier to recoup that authority versus building from scratch.”

The good news—this brand had been able to essentially maintain the authority they launched with, as there were no real noticeable gaps between search data then and search data today. But, in the digging, we uncovered a gem. We found a lot of URLs that used to draw traffic that are not on their tree today. After digging further, we found a redesign occurred in the late 90s. SEO wasn’t factored in, creating a ton of 404s. These 404s were not even being charted in Google Webmaster Tools, yet they are still being linked to today from external sites (remember, GWT is still quite directional in terms of the data they provide). Better yet, we pulled links from OSE and Majestic, and saw that thousands of forgotten links existed.

This is an easy campaign—create a 301 redirect matrix for those dead pages and bring those old backlinks to life.
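As a sketch of what building that matrix might look like, the Python below maps each dead URL to its closest current equivalent and emits Apache mod_alias rules. The URL pairs are invented for illustration; adjust the output syntax for your server.

```python
# Hypothetical redirect matrix: dead URL -> closest current equivalent.
# These paths are invented examples, not the client's real URLs.
redirect_matrix = {
    "/old-collection/spring-97": "/collections/spring",
    "/styles/denim-jackets.html": "/category/jackets",
    "/press/1998-awards": "/about/press",
}

# Emit one Apache mod_alias rule per dead page.
rules = [f"Redirect 301 {old} {new}" for old, new in sorted(redirect_matrix.items())]
print("\n".join(rules))
```

In practice you would feed the dead URLs exported from your link tools into the matrix rather than typing them by hand.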

But we kept wondering what pages were out there before the days where analytics was implemented. Using the Wayback Machine, we found that even more redesigns had occurred in the first few years of the site’s life. We didn’t have data for these pages, so we had to get creative. Using Screaming Frog, we crawled the Wayback Machine to pull out URLs we didn’t know existed. We fed them into the link tools, and sure enough, there were links there, too.

Story 2: To “View All” or Not To “View All”

Most eCommerce sites have pagination issues. It’s a given. A seasoned SEO knows immediately to look for these issues. SEOs use rel=”next” and “prev” to help Google understand the relationships. But does Google always behave the way we think they should? Golly, no!

Example 2 is a company that sells barware online. They have a lot of products, and tend to show only “page 1” of a given category. Yet, the analytics showed instances where Google preferred to show the view all page. These were long “view all” pages, which, after comparing to the “page 1” pages, showed a much lower bounce rate and higher conversions. Google seemed to prefer them in several cases anyway, so a quick change to default to “view all” started showing very positive returns in three months.
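For reference, the pagination hints mentioned above live in the <head> of each component page. A sketch with invented URLs follows; where a “view all” page wins, Google’s guidance at the time was to canonicalize component pages to it.

```html
<!-- On a hypothetical /barware?page=2 component page -->
<link rel="prev" href="http://www.example.com/barware?page=1">
<link rel="next" href="http://www.example.com/barware?page=3">
<!-- If the "view all" page performs better, point component pages at it: -->
<link rel="canonical" href="http://www.example.com/barware/view-all">
```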

Story 3: Selling What Analytics Says to Sell

I have to change some details of this story because of NDAs, but once upon a time there was a jewelry company that sold artisan products. They were fond of creating certain kinds of keepsakes based on what sold well in their retail stores. Online, though, they weren’t performing very well selling these same products. The website was fairly new and hadn’t quite earned the footing they thought their brand should have, but that wasn’t the terminal answer we wanted to give them. Instead, we wanted to focus on areas they could compete with, while building up the entire site and turning their offline brand into an online brand.

Conversion rates, search metrics, and even PPC data showed a small but consistent win on a niche product that didn’t perform nearly as well in the brick-and-mortar stores. It wasn’t a target for us or the CEO. Yet online, there was obvious interest. Not only that, with low effort, this series of products was poised to score big in natural search due to low competition. The estimated search volume (per Google Keyword Planner) wasn’t extraordinary by any stretch, but it led to traffic that spent considerable dollars on these products. So much so, in fact, that this product became a focus point of the website. Sometimes, mining through rocks can uncover gold (jewelry pun intended).


My biggest hope is that your takeaway after reading this piece is a candid look at your role as an SEO or digital marketer. You’re a person with a “unique set of skills,” being called upon to perform works of brilliance. Being busy does create pressure; that pressure can sometimes force you to look for shortcuts or “phone it in.” If you really want to find the purest joy in what you’ve chosen as a career, I believe it’s from the stories embedded within the data. Go get ’em, Sherlock!

from Beverly’s LTD http://ift.tt/1N5BYNG

How to Get Your App Content Indexed by Google

Posted by bridget.randolph

As mobile technology becomes an increasingly common way for users to access the internet, you need to ensure that your mobile content (whether on a mobile website or in a mobile app) is as accessible to users as possible. In the past this process has been relatively siloed, with separate URLs for desktop and mobile content and apps tucked away in app stores.

But as app and mobile web usage continues to rise, the ways in which people access this content are beginning to converge, making it more important to keep all of these different content locations linked up. This means that the way we think about managing our web and mobile content is evolving:

So how do we improve the interaction between these different types of content and different platforms, getting to the point of being able to have a single URL which takes the user to the most appropriate version of the content based on their personal context?

The first step is to ensure that we are correctly implementing deep linking (e.g., linking to a particular screen within an app) for apps which have comparable webpage content, allowing our app content to rank in mobile search.

Image credit: Google Developers

Google indexation provides benefits for both Android and iOS apps. The benefits for Android apps are twofold:

  • users searching on an Android device who have not yet installed your app will see the app show up in mobile search results; and
  • Android users who do have your app installed will get query autocompletions when they use browser search which can include results from your app, as well as seeing enhanced display elements in the SERP (such as the app icon). It’s basically like rich snippets for apps.

Image credit: Google Developers

On iOS, app ranking is currently only supported for apps already installed on the device. Apple users should see search results which include links to installed apps and also include the enhanced display elements mentioned above.

In addition, Google recently announced that mobile apps which use the new App Indexing API for deep linking may receive a rankings boost in mobile web search. They are releasing a new and improved version of Google Now, “Now on Tap,” in their latest OS update (Android M), which allows you to search content across your phone without navigating out of whatever app (or website) you are currently using. The catch is that app content has to be in their index in order to be included in a “Now on Tap” search.

It’s not just Google, either; Apple is implementing their own version of a search index to allow iOS9 users to search and discover web and app content without using a third-party search engine, Bing has its own approach to app indexation and ranking, and other services aren’t far behind.

This post, however, will focus on how to set up your Android and iOS apps to appear in Google search results. While the idea of app indexation isn’t new, it is an area of rapid innovation and the process for getting your apps indexed by Google has recently been simplified. This post is therefore intended to provide a brief overview of that process and to serve as an update to the information which is currently available.

The implementation

The good news is that it’s getting simpler to add the relevant markup to your web content and get your app content indexed and ranking in mobile search results.

The basic process is only three steps:

  1. Support HTTP deep links in your mobile app. For iOS you will need to do this by setting up support for “Universal Links.” “Universal Links” are what Apple calls HTTP links that have a single URL which can open both a specific page on a website and the corresponding view in an app.
    Note: At this point, you can register your app with Google, associate it with your website and stop there—as long as you are using the same URLs for your web content and your app content, they should be able to automatically crawl, index, and attempt to rank your app content based on your website’s structure. However, implementing App Indexing and explicitly mapping your web content to your app content using on-page markup can provide additional benefits and allow for a bit more control. Therefore, I recommend following the full process, if possible.
  2. Implement Google App Indexing using the App Indexing API for Android, or by integrating the App Indexing SDK for iOS 9.
  3. Explicitly map your web pages to their corresponding app screens using either a rel=alternate link element on the individual page, by referencing the app URLs in your XML sitemaps, or by using schema.org markup.

You can find a more step-by-step explanation of this process (looking at Android and iOS separately) below.

The app indexation process used to be a bit more complex, because HTTP links aren’t supported by older iOS versions. Instead, developers had to use something called “Custom URL Schemes” to link to iOS app content. This meant that you essentially had to create a unique scheme for your app URLs and then add support for these in the app code.

Custom URL schemes have a couple other downsides besides adding complexity, namely:

  • different app developers can claim the same custom URL scheme, whereas with HTTP links you can associate the app to a particular domain or set of domains; and
  • with custom URL schemes, tapping the URL when the app isn’t installed results in a broken link (because it only links to content within the app), whereas HTTP links are web links as well and can take the user to a webpage if the app isn’t installed (as long as the URL is the same for both the app view and the corresponding webpage).

While you can still use the custom URL scheme approach, the good news is that Google’s App Indexing is now compatible with HTTP deep link standards for iOS 9, which Apple calls “Universal Links.”

You should still add markup to any webpages which have content corresponding to a particular app screen. Think of it like rel=canonical or like mobile switchboard tags, but for apps. Be aware that when Google finds a link between a webpage and an app page which they think are equivalent, they will compare the two pages, and you will receive a ‘Content Mismatch’ error in the Search Console if they don’t believe the content is similar enough.

Getting Android apps indexed in Google

Step 1: Support HTTP deep links in your app by adding intent filters to your manifest.

An intent filter is a way of specifying how an app responds to a particular action. Intent filters for deep links have three required elements: <action>, <category>, and <data>. You can find more guidance on this from Google Developers. Here is their example of an intent filter which enables support for HTTP deep links:

<intent-filter android:label="@string/filter_title_viewrecipes">
    <action android:name="android.intent.action.VIEW" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />
    <data android:scheme="http"
          android:pathPrefix="/recipes" />
</intent-filter>

Noindex option:
Just like for websites, you can add noindex directives for app content as well. Include a noindex.xml file in your app to indicate which deep links should not be indexed, and then reference that file in the app’s manifest (AndroidManifest.xml) file. You can find more detail on how to create and reference the noindex.xml file here.
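As a sketch of that file, based on Google’s developer documentation of the time (the URLs below are invented), noindex.xml looks like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<search-engine xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Block one specific deep link -->
    <noindex uri="http://gizmos.com/hidden"/>
    <!-- Block everything under a path prefix -->
    <noindex uriPrefix="http://gizmos.com/internal/"/>
</search-engine>
```

It is then referenced from AndroidManifest.xml with a <meta-data android:name="search-engine" android:resource="@xml/noindex" /> entry inside the <application> element.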

Step 2: Associate your app to your site in Google Search Console.

This is done in Google Search Console (you can also do it from the Developer Console). As long as your app is set up to support deep links, this step is technically all you have to do to allow Google to start indexing your app. It will allow Google to index and crawl your app automatically by attempting to figure out the app structure from your website structure.

However, if you do stop here, you will not have as much control over how Google understands your content, which is why the explicit mapping of pages to app versions is recommended. Also, if you can’t use the API for some reason, you need to make sure that Googlebot can access your content. You can check that this is configured correctly in your site’s robots.txt file by testing some of your deep links using the robots.txt tester tool in the Search Console.

Step 3: Implement app indexing using the App Indexing API.

Using the App Indexing API is definitely worthwhile; apart from anything else, apps which use the API should receive a rankings boost in mobile search results, and you don’t need to worry about Googlebot struggling to access your content.

The App Indexing API allows you to annotate information about the activities within your app that support deep links (as laid out in your intent filters). For details on how to set this up, see the Google Developers guidance.

Step 4: Test your implementation.

You can test your implementation (always on a fresh installation of your app!) with the following tools. (Find more info about how to use each of these tools here.)

Android Debug Bridge – to test deep links from the command line

Fetch as Google (Search Console) – to test what Google sees when it crawls your app deep links

You can also track search traffic to these deep links in the Search Console’s Search Analytics report.

Getting iOS apps indexed in Google

Step 1: Support HTTP deep links in your app by setting up support for “Universal Links.”

To support universal links in your iOS app, you need to first ensure that your app handles these links correctly by adopting the UIApplicationDelegate methods (if it doesn’t already use this protocol). Once this is in place, you can associate your app with your domain.

You’ll do this by:

  • adding an “associated domains” entitlement file to your app’s project in XCode that lists each domain associated with your app; and
  • uploading an apple-app-site-association file to each of these domains with the content your app supports—note that the file must be hosted at the root level and on a domain that supports HTTPS.
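A minimal apple-app-site-association file follows this shape; the team ID, bundle ID, and paths below are invented for illustration:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCDE12345.com.gizmos.ios",
        "paths": ["/example/*", "/recipes/*"]
      }
    ]
  }
}
```

The appID is your Apple team ID joined to the app’s bundle identifier, and the paths array lists which URL paths on the domain the app can handle.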

To learn more about supporting Universal Links, view the Apple Developer guidance.

Step 2: Register your app with Google (using the GoogleAppIndexing SDK for iOS 9).

You’ll need to add the App Indexing SDK to your app using the CocoaPods dependency manager. For step-by-step instructions, check the Google Developers’ guide. Basically, this allows you to register your app with Google, just like Android apps are registered via the Search Console. This also means that Google can now read the apple-app-site-association file to understand what URLs your app can open.
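A minimal Podfile for that step might look like the following; the target name is a placeholder, and the pod name is taken from Google’s guide of the time (verify against the current documentation):

```ruby
platform :ios, '9.0'

target 'GizmosApp' do   # hypothetical app target name
  pod 'GoogleAppIndexing'
end
```

Run pod install afterwards and open the generated .xcworkspace rather than the original project file.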

Step 3: Test your implementation.

You can test whether this is set up correctly by tapping a universal link in Safari on an iOS 9 device and checking that it opens the right location in your app.

Mapping your webpages to your app with on-page markup or sitemaps

Once you’ve set up the deep linking support for your Android and/or iOS app(s), the final step is to explicitly identify the corresponding webpages to the correct app screens using one of the supported markup options. This step allows you to indicate more clearly to Google what the relationship is between a given page and its corresponding app link (both of which should already share the same URL if you are using HTTP links). Following this step also allows you to indicate the relationship to Bing crawlers, which otherwise wouldn’t see the app content, and to allow Apple to index your iOS app.

You can do this mapping either in the head of the individual page using a link element, using schema.org markup (for Android only), or in an XML sitemap.

A note on formats for app links

The format for an Android HTTP link is:

android-app://{package_name}/{scheme}/{host_path}

The {package_name} is the app’s “Application ID,” which is how it is referenced in the Google Play Store. So a link to the (example) Gizmos app might look like this:

android-app://com.gizmos.android/http/gizmos.com/example

For iOS links, you use the app’s iTunes ID instead of the package name. So an iOS app URL uses this format:

ios-app://{itunes_id}/{scheme}/{host_path}

For HTTP links the {scheme} is “http,” which would mean your URL would look like this:

ios-app://123456/http/gizmos/example
How to reference your app links

Note: Google provides guidance on the three currently supported deep link methods here.

Option 1: Link rel=alternate element

To add an app link reference to an individual page, you can use an HTML <link> element in the <head> of the page.

Here is an example of how this might look if you have both an iOS and Android app:

<head>
  …
  <link rel="alternate" href="android-app://com.gizmos.android/http/gizmos.com/example" />
  <link rel="alternate" href="ios-app://123456/http/gizmos/example" />
</head>
<body> … </body>

Option 2: Schema.org markup (currently supported on Android only)

Alternatively, if you have an Android app, you can use schema.org markup for the ViewAction potential action on an individual page to reference the corresponding app link.

Here is an example of how this might look:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebPage",
  "@id": "http://gizmos.com/example",
  "potentialAction": {
    "@type": "ViewAction",
    "target": "android-app://com.gizmos.android/http/gizmos.com/example"
  }
}
</script>

Option 3: Add your app deep links to your XML sitemap

Instead of marking up individual pages, you can use an <xhtml:link> element in your XML sitemap, inside the <url> element specifying the relevant webpage.

Here is an example of how this would look if you have both an iOS and an Android app:

<?xml version="1.0" encoding="UTF-8" ?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://gizmos.com/example</loc>
    <xhtml:link rel="alternate" href="ios-app://123456/http/gizmos/example" />
    <xhtml:link rel="alternate" href="android-app://com.gizmos.android/http/gizmos.com/example" />
  </url>
</urlset>

Additional information

What about apps which don’t have corresponding web pages?

Unfortunately, as of this writing, Google does not officially offer app indexation for apps which don’t have corresponding web content. However, they are trying to move in this direction, and as such are beginning to try this out with a handful of apps with “app-only” content. If you have an app with app-only content, and would like to get this content indexed, you can express interest using this form.

What about getting my app indexed in Bing?

Bing supports two open standard options for linking webpages to app links:

  • App Links
  • Schema.org

To learn more about how to implement these types of markup, see the guidance on the Bing blog.

Quick reference checklists

Will Critchlow recently spoke about app indexation in his presentation at Searchlove London. He provided two useful checklists for Android and iOS app indexing:

Image source: http://ift.tt/1R850vq

To learn more about app indexing by Google, check out Emily Grossman and Cindy Krum’s excellent post over on SearchEngineLand.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Beverly’s LTD http://ift.tt/1RwzxTs

Just the Tips! How to Manage Your Twitter Account More Effectively


Are you sure you’re taking full advantage of your Twitter account’s potential? Wouldn’t it be nice to save time and work by streamlining your Twitter process? Now that you know the basics, in this blog post we’re going to go a little deeper and give you 14 tips to manage your Twitter account more effectively.


1) Pin Your Best and Most Recent Content to Your Feed

Twitter now gives you the option to ‘pin’ a tweet to the top of your timeline. When your content is ‘pinned,’ all your followers can see it when they go to your Twitter stream. It’s a good idea to pin your most recent and important posts so that new followers see them right away; a pinned tweet stays at the top until you decide to unpin it. Also, did you know that if you pin GIFs they will automatically play when someone goes to your profile?


2) Keep Track of Engagement

This is really important if you want to manage your Twitter account effectively. It’s necessary to see whether or not your followers are engaging with your posts. Are you getting retweeted and favorited as often as you would like?

Check your Twitter Analytics frequently and analyze which content receives the most engagement and which content does not. Focus on the content that does, and consider changing your content strategy if your posts have low engagement. You might also want to rethink your titles. Check out this blog post to learn how to write kick-ass blog titles and entice your followers to click!


3) Set up Twitter Lists

Lists on Twitter are great for organizing and reaching out to influencers. First, start making lists to organize your feed. To do so, just go to someone’s profile, click on “Settings” then “Add or Remove from lists.”  You will have the option to create new lists. Some ideas for creating lists for your business could be organizing them by:


  • competitors
  • customer relationships
  • industry leaders
  • trends
  • employees/team members
  • events
  • blogs you like
  • your biggest fans



Secondly, you can use lists to find influencers in your industry. Once on someone’s profile, you can see the lists they are a part of by clicking on “Lists” then “Member of.” See what other lists they are a part of and find other really influential people to reach out to.


4) Use Awesome Hashtags


Hashtags are extremely useful for finding your target audience, participating in online conversations centered around shared interests, and getting new people to see your content. Also, tweets with hashtags get twice as much engagement! However, it’s important to understand that there is a science behind using hashtags, how many, how long, how to find them and how to analyze them. We cover all of that, plus give you really useful tools in our blog post The Ultimate Guide to Using Hashtags.


5) Be Selective with New Followers


Yes, growing your followers is important, but it’s useless to gain new followers if they aren’t really interested in your business or industry. Thankfully, there are apps like ManageFlitter that let you search for a certain demographic of users. For example, people who tweet in “XXX language,” who have “XXX keyword” in their bio, who are included in over 50 lists, etc. Bottom line: gain followers who will be interested in your content, and who therefore will engage with it.


6) Take Advantage of Direct Messages


Even though DMs have gotten a bad rap, if you use them correctly they can be extremely effective for your marketing campaign. It’s important to segment your audience and send very specific, relevant messages to your users; for example, to promote specific webinars, events, promos, etc. Some ways to make your DM campaigns more effective: include added value in your messages, and personalize them to include the person’s first name. Beware of using spam words like “free” or “click here,” as users might not trust them and your messages may be flagged as spam.


7) Use Postcron to Schedule Your Tweets


This works great, especially if you manage more than one Twitter account. You can use Postcron to schedule tweets throughout the day, and even set up Predetermined Publishing Times to schedule your tweets more efficiently, during the times when your audience is online most.



8) Activate Your Twitter Cards


This awesome new Twitter feature allows you to include more information about your posts, pictures, videos, or app explanations. By activating Twitter Cards you are increasing your chance of getting retweeted or favorited. Check out our Guide to Using Twitter Cards.




9) Check the Growth and Decline of Your Followers


Check to see if you are gaining or losing followers. Use this information to adjust and manage your Twitter Strategy accordingly. Twitter Analytics allows you to perform this task frequently, and for free!


10) Know Your Audience


Get to know not only when your followers are online, but also what they like and are interested in. If you find that they are not interested in the types of things you are tweeting, you will need to reach out to new followers. Use a tool like Simply Measured to find out information about location, influence, behaviour, and interests.



11) Monitor Your Keywords


Think of keywords related to your brand, product, or service. It’s worth keeping track of how these words are used over the internet in order to boost your SEO success and ROI. Use tools like Topsy to see reactions, and posts related to your keywords across the internet.



SproutSocial is another great tool for monitoring keywords.


12) Clean Up Your Account


As time goes on, you will start to get fake or inappropriate followers. It’s important to clean up your account every once in a while, probably once a month. Some users might have spammy tweets, very few followers themselves, or no profile picture. In these cases, use tools like Unfollowers.com to detect and remove these accounts.


13) See What Your Competitors Are Up To


Want to gain more followers who will be interested in your content? Yeah, we thought so! A great way to find just those people is to take a look at who’s following your competitors. Try following some of them; when they follow you back, you will probably see a boost in engagement. Crowdfire (previously JustUnfollow) has a feature called Copy Followers that lets you enter any Twitter account in your field and puts together a list of its followers for you. This is a great way to find a targeted audience that is active and engaging.


14) Reshare Your Content!


Do you have lots of articles that are still relevant to your followers? Twitter is a great platform to reshare evergreen content, since your old followers may have missed it or forgotten about it, and your new followers have never seen it! It’s a good idea to create a content calendar for re-sharing your old content. For creative ways to repurpose your content, check out this blog post.

Wrap it up!


We hope that you can apply some of these tips to your social media marketing strategy for your business, brand or personal account. Taking an active role to manage your Twitter account will help you reach your followers more effectively.


Let us know which tips you haven’t been using already, or if you have any others to offer! We look forward to hearing from you! Thanks, and don’t forget to share this post!


The post Just the Tips! How to Manage Your Twitter Account More Effectively appeared first on Postcron Blog.
