Building an Evergreen Machine: How to Write Evergreen Content

By Andrew Tweddle

Evergreen content: it just sounds calming. You can solve all your problems by simply saying: “Don’t worry, we’ll just create some evergreen content and then sit back and watch the traffic roll in forever.” Done.

Except, obviously, nothing is that simple in practice. How do you ensure the content you create will be evergreen and that it will generate enough interest to be worth your time? After all, a piece of content that gets one view a week, every week, is technically evergreen, but I wouldn’t recommend writing it.

This piece aims to get a concrete handle on the vague idea of evergreen content, including how to recognise it, how to plan your content strategy around it, and how to compound your success into an ever-growing traffic level.

What makes content evergreen?

Evergreen content, just like trees that never lose their leaves, is content that never loses a residual level of traffic. It consistently generates interest over time, and people will still come looking for it for a long time to come. Here at Distilled, we’ve had success with this in the past. One piece that stands out, simply because of its age and the volume of traffic it receives, is our Omniture guide. Since it was published in March 2015, over two years ago, it has received a total of 144,000 page views, with every month since publication receiving between 2,700 and 6,800 views:

But what makes this evergreen? Well, the obvious answer is that the topic isn’t time sensitive. There’s also the simple matter that it’s not specific to our company: it’s not an update or a pat-on-the-back blog post, but a resource for many people working in digital marketing. Most important of all, though, is that it’s really, really good. Without wanting to sound biased, no other resource we can find comes close*, and for your content to be truly evergreen, you need to be confident that a competitor won’t come along and make something better. I’ll come back to that problem later.

*Omniture has been renamed Adobe Analytics, which means we’ll have to do some updating to the post to keep it evergreen. More on how to do that later on…

The ‘other’ kind of content

Content that isn’t evergreen can be classed as temporal content, in that it is time-related. Newspapers are a classic example of publishers creating temporal content; you want the most up-to-date content when you’re reading the news at your desk in the morning. Plenty of other brands rely on temporal content too, such as fashion brands. A piece on the best summer fashion trends is a staple of fashion content, but there is no way to make it relevant more than three months after its publish date.

However, for most brands, temporal content is not a sustainable strategy. The main problem is the strain on resources. The kind of newsworthy content that gains attention is also the most likely to be covered extensively. A breaking news story makes no allowance for your content calendar or other pressing tasks: the first-mover advantage disappears very quickly, and with it the value of what you’re writing. And even when this works, growing traffic to your website means creating content at an ever-increasing rate.

On the other hand, evergreen content allows you to grow traffic levels while producing content at a steady rate, as the model I’m about to show you proves…

The compounding returns of content marketing

Evergreen content, when made on a consistent basis, essentially starts to layer the traffic from different pieces on top of each other. For example, if you publish a blog post on day one, and it generates 100 views, traffic on day one for your blog equals 100. If you publish a post on day two, and it gets 100 views, and the older post also generates 100 views, traffic for day two is 200, and so it goes.
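To make that layering concrete, here’s a toy model in Python (the 100-views figure and the retention rates are illustrative assumptions, not real data):

```python
def traffic_on_day(day, day_one_views=100, retention=1.0):
    """Total blog traffic on a given day, assuming one post is published
    per day and each post keeps `retention` of its views day over day."""
    return sum(day_one_views * retention ** age for age in range(day))

print(traffic_on_day(2))                    # 200, matching the example above
print(traffic_on_day(365, retention=0.99))  # evergreen: the layers keep stacking
print(traffic_on_day(365, retention=0.50))  # temporal: plateaus at ~200 within a week
```

Even with 1% daily decay, the evergreen blog’s daily traffic keeps climbing for years; with 50% daily decay, it flatlines almost immediately.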

Tomasz Tunguz, a venture capitalist, has explored this idea in great depth and has done the best job I’ve seen of visualising an evergreen content strategy vs. a temporal content strategy – in the following two charts each coloured layer represents a single piece of content.

In the evergreen example below, you can see that ‘layering’ I talked about above. Even factoring in some decay, the results are obvious:

Source: Tomasz Tunguz

Conversely, temporal content looks a lot more ‘spiky’ with traffic levels fluctuating dramatically. Most significantly, in the following example, by publishing the same number of posts with the same number of first-day views, the temporal blog never gets above 70,000 visits per month (whereas the evergreen content model reaches just short of 200,000):

Source: Tomasz Tunguz

The above is a ‘perfect’ example, but you can see this on the Distilled blog too. ‘Google to Announce that Links are no Longer a Major Ranking Factor‘ is a fairly recent example of a post that got high traffic levels when first published (16,500 page views in month one, most of those within one week of publishing). On the face of it, that’s better than the Omniture guide, which only got 6,800. However, if you look at the chart below, showing the first ten months of traffic for each post, the value of evergreen content becomes much clearer:

The evergreen post, shown in red, continued to get consistent traffic every month, reaching 84,000 page views by month ten, whereas the temporal piece had only received 18,600 – just another 2,100 over the nine months after launch. On a pure traffic level, the evergreen piece is much more valuable, because of the ‘compounding effect’ of evergreen content, as you can see below:

As I have already touched on, in reality, doing this in a repeatable way is never this simple, but there are a number of tactics you can add to your content strategy to help in creating evergreen content.

What stops evergreen content from lasting forever?

Even the best evergreen content generally sees some decline over time. One reason for this is the work of competitors. Yes, you’re going to try and make the best piece of content for your topic and purpose, but you can’t stop somebody else coming along and doing the same thing. Even if yours is ‘better’, you will lose some traffic to a worse piece of content simply by virtue of it existing. Additionally, as you get better at producing content, especially if you find your niche, you’ll find that you have to compete with yourself, as topics will overlap. Even the Omniture guide, which I’ve used as our best example of evergreen content, is slowly starting to lose traffic.

Bearing all of this in mind, I’ve put together a few ideas of how to start positioning your content to earn and keep more traffic with an evergreen mindset.

Putting everything into practice

Taking the theory, and the pitfalls, of evergreen content on board, the final part of this post aims to turn the concept into a series of actionable steps to help grow your brand’s residual traffic. I’ve broken it down into four actions.

Ship it and then tweak

When you set out to make a piece of evergreen content, it can be tempting to try and write ‘War & Peace’, leading to a behemoth piece of content that attempts to be all things to all people. And that’s if you ever manage to publish it. So, narrow your focus, and create something that serves a specific purpose.

When creating new content, remember the phrase “Don’t let perfect be the enemy of good”: put the effort into shipping it as soon as possible. Then learn from the initial traffic and engagement, and focus on updating and repurposing – which brings us onto the next point.

Tip: Shipping and improving is essentially a pared-down version of the minimum viable product (MVP) methodology. Understanding this in more depth can really help you decide what to measure and what to change.

Repurpose content

Despite your best intentions, evergreen content will begin to age and give subtle cues to readers that it’s old or out of date, which will lead to a drop in traffic. Was the post published on a blog with publish dates displayed anywhere? If so, consider republishing the post when it’s updated so it moves back to the top. This is especially useful when you publish lots of content; you can’t expect readers to land on your blog/resources section and casually scroll to page 43.

For the URL specifically, we’d never recommend including the publish date, with the only arguable exception being content that you want to show in Google News.

There’s also an opportunity to update the post to reflect changes. The Omniture Guide is technically now outdated as the software isn’t called Omniture anymore, but now Adobe Analytics. This actually increased the monthly traffic for a while, as people were still searching for ‘Omniture’, but the official page for the product had disappeared from many of the SERPs, and so our post was pushed to the top spot. However, that was only temporary, and we now plan to update and republish the post.

Tip: If you plan on replacing the old piece of content with the new one, then be sure to either use the same URL as the older piece, or 301 redirect to the new piece.
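If your CMS doesn’t handle redirects for you, a minimal sketch of the application-level version might look like this (Python/Flask, with hypothetical slugs):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of retired URLs to their replacements.
MOVED = {
    "/omniture-guide": "/adobe-analytics-guide",
}

@app.route("/<path:slug>")
def serve(slug):
    target = MOVED.get("/" + slug)
    if target:
        # 301 (permanent) passes link equity to the new URL; a 302 would not.
        return redirect(target, code=301)
    return "Not found", 404
```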

You should also consider whether you can reformat the content. You could turn a blog post into a video, a Slideshare, a webinar or an email series. Plenty of options are relatively easy wins in this scenario: for low effort, you can capitalise on successful pieces with additional “launch” traffic and increased authority from links back to the keystone piece.

Use temporal content to aid discovery

While I’ve mentioned some of the downsides of temporal content in terms of long-term traffic, it can be very useful in aiding discovery of your brand and other content when it goes viral. To help make this tactic effective, simply make sure there’s a useful CTA on each and every piece of temporal content. That CTA should direct the reader to a piece of evergreen content within the same topic. This should help to both keep the reader on your site for longer and increase the traffic levels of the associated content.

Tip: Use the CTA to gain email signups, as you can then target the reader with content on a regular basis.

Wrap it up in a content calendar

Regardless of your exact content strategy, a content calendar is a must to organise your publishing schedule and make sure your content team is on track. Among the many benefits, you can organise your repurposing/republishing schedule to keep a smooth flow of content landing on your site.

Rounding up

Evergreen content is simultaneously an easy concept to grasp yet difficult to put into practice. I hope this guide has been useful in explaining the value of evergreen content, and giving you a jumping off point to creating your own, or even upgrading what you already have. While the strategy will never be 100% bulletproof, it’s satisfying to see the resulting effect of compounding returns, as you turn a bunch of individual bits of content into a traffic-driving machine.

Source: Distilled


Finding a Happy Medium: Should You Use Medium for Your Company’s Content?

By Lydia Gilbertson

You’ve likely read an article on Medium before, even if you didn’t notice the green “M” hanging out in the corner of your screen. The platform houses tons of popular content created by its users and publications alike. A place for thoughtful, long-form, niche and technical content, it’s not surprising that everyone from CEOs to freelance journalists posts there regularly. Increasingly, businesses and publications have been using the platform for their content distribution. With all that in mind, is Medium the right way for your business to reach new readers?

What is Medium?

Thirty million users visit the content platform Medium.com each month. Founded by Ev Williams (the former CEO of Twitter), the platform-CMS-community hybrid has made a lot of changes in how it operates alongside both publishers and marketers. An increasing number of businesses are using Medium for their company blogs, and many publishers are starting to host and distribute their entire content library through it. Like any platform, there are pros and cons to its utilization in a business’s digital marketing efforts. By examining the capabilities and options businesses have on Medium from an organic search marketing perspective, I hope to provide deeper insight into whether you should utilize Medium to improve your company’s digital content community.

Medium as a CMS

Managing your content on Medium is extremely simple. There aren’t a ton of different editing options, but it is all you need to publish written digital content effectively. Medium is a CMS without a million different plugins and add-ons, making the publishing experience fast and easy. Unlike WordPress, which has an endless number of design themes to choose from, Medium allows for very limited on-page visual customization. This is certainly a con for larger sites, as it limits your use of the visual branding elements that make your company unique.

Medium allows for Google Analytics integration to its sites, but it also has a limited amount of built-in measurement for your posts both individually and as a group in its CMS. It uses three metrics: Views, Reads and Recommends.

A graph showing the standard Medium analytics view.

‘Views’ refers to the number of users that clicked on an article. ‘Reads’ is the number of people who actually read the article, which, as far as we can tell, is calculated using the amount of time a user spends on the page and the estimated read time shown at the top of each article. ‘Recommends’ is the Medium equivalent of a share on its platform. The emphasis these metrics put on the time users spend on the page suggests that the Medium algorithm favors posts that people read in their entirety. This is one of the reasons the platform works so well for long-form, informative and academic content.
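We don’t know Medium’s actual formula, but a purely speculative sketch of such a heuristic, consistent with the behavior described above, might look like this:

```python
def counts_as_read(seconds_on_page, estimated_read_minutes):
    # Speculative threshold: count a view as a "read" once the visitor
    # has spent most of the post's estimated read time on the page.
    return seconds_on_page >= 0.8 * estimated_read_minutes * 60

print(counts_as_read(240, 4))  # True: four minutes spent on a four-minute read
print(counts_as_read(30, 7))   # False: a quick bounce
```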

Medium as a content platform

Certainly one of the most appealing parts about utilizing Medium is its large network of engaged users. The company reported a 300% increase in users since last year and has continued to grow. Many large publishers and companies use Medium including The Ringer, The Awl and Signal v Noise (BaseCamp’s Blog).

Medium’s user-base is very focused on getting high-quality content that is not watered down. This is shown through Medium’s pivot in its business model earlier this year, when it stopped offering advertising services for publishers; the model now puts more of an emphasis on the content itself. So, if you’re a publishing site that gets most (if not all) of its revenue from banner ads, Medium may not be a wise choice for your business.

Despite that, Medium is a great platform for emerging publishers or startups’ blogs for exactly this reason: users actually read the content. The Medium team puts a heavy focus on the amount of time spent reading each post, and measures reading time meticulously, taking pauses and scrolling behavior into account. In 2016 its users spent an average of 4.5 million hours per month reading on the platform. Users can also subscribe to blogs, authors, tags or categories of posts they like in their custom feeds. The essentially built-in audience that Medium provides gives your site a great platform to jump off from, helping you build your user base more quickly and easily.

Case study: TheRinger.com

TheRinger.com was one of the first large publishing sites to use Medium for its initial content distribution strategy. The Ringer is a product of Bill Simmons’ podcast network and already had an audience following from its previous incarnation (Grantland). In this interview by Recode, Simmons expands on his media projects’ relationship with the Medium platform. He explains that initially they utilized Medium as a way to maintain a website without spending most of their funds on development projects. This is another way small or startup companies can utilize Medium as both a content network and a platform.

Custom domains vs. Medium-generated domains

Later in the above-mentioned podcast interview, Simmons goes on to explain that even though they can no longer sell banner ads on the site (they are funded entirely by podcast ads anyway), they will stay on the platform a little bit longer; however, they intend to build their own site eventually*. This suggests that new websites looking to eventually ‘outgrow’ Medium should opt for a custom domain rather than a Medium-generated domain (like www.medium.com/site-name). Depending on one platform to both hold and distribute all of your content is generally considered risky. It is possible to move your content off of Medium, and doing so is much easier if you utilize a custom domain rather than Medium’s.

*On 06/01/2017 TheRinger.com announced that it would be moving to hosting from Vox Media. The site has not yet migrated off of the Medium platform.

On the other hand, if you plan on staying inside the Medium platform, much smaller companies and blogs may find it an advantage to utilize Medium’s high domain authority (92) to help boost traffic in organic search results, rather than switching to their own domain and starting from zero.

What if you already have a website?

All of the suggestions I’ve made so far about the utilization of Medium in your digital marketing efforts have been centered around newer sites. However, there are a few ways to get your content seen on its network if you already have a site that has a regular audience and a domain.

  1. You can republish your content on Medium. Medium allows you to canonicalize content you post onto the Medium network back to your own website. This means you neither pick up a duplicate content penalty nor get outranked by your Medium posts in the SERP. Just make sure that you are canonicalizing back to your site’s original post (there’s a quick way to verify this in the sketch after this list). You can import a story here.

  2. You can also migrate your site to the platform fairly easily. However, do not do this unless you are sure that its limited CMS and user-base is right for you. If you create a lot of long form informative content and have limited development help or need, migrating your site to Medium is an option to consider.
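As a sanity check on point one, here’s a small sketch (using the requests and beautifulsoup4 libraries, with hypothetical URLs) that verifies a republished post’s canonical tag points back at your original:

```python
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    """Return the canonical URL a page declares, or None if it has none."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    return tag["href"] if tag else None

# Hypothetical pair: the Medium copy should canonicalize to the original.
medium_copy = "https://medium.com/@yourbrand/your-post-123abc"
original = "https://www.example.com/blog/your-post"
print(canonical_of(medium_copy) == original)  # should print True
```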

Medium and technical SEO

When I first started examining Medium from an SEO perspective, I was alarmed by the number of URLs associated with each page. Both author and post pages use around eight URLs similar to, “https://theringer.com/@michaelweinreb?source=———1”. However, all of these extra URLs are for tracking purposes and are properly canonicalized, so there should not be duplicate content issues as a result.

Medium also has an interesting sitemap strategy. All of Medium’s sitemaps are auto-generated, as on many platforms. What makes Medium’s different is that it splits the sitemap index into sections sorted by date (see image below):

These dated sitemaps may also correlate with the priority many Medium sites seem to get when it comes to timely content in the SERPs, making them a sound strategy for news sites.
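To illustrate the idea, here’s a sketch of what generating a date-sharded sitemap index could look like (the posts and domain are hypothetical; Medium’s actual generator isn’t public):

```python
from collections import defaultdict

# Hypothetical posts: (URL, publish date as YYYY-MM-DD).
posts = [
    ("https://www.example.com/post-a", "2017-04-11"),
    ("https://www.example.com/post-b", "2017-05-02"),
    ("https://www.example.com/post-c", "2017-05-19"),
]

# Shard posts into one child sitemap per month.
by_month = defaultdict(list)
for url, published in posts:
    by_month[published[:7]].append((url, published))

index_entries = []
for month, entries in sorted(by_month.items()):
    name = f"sitemap-{month}.xml"
    urls = "\n".join(
        f"  <url><loc>{u}</loc><lastmod>{d}</lastmod></url>" for u, d in entries
    )
    with open(name, "w") as f:
        f.write(
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{urls}\n</urlset>\n"
        )
    index_entries.append(
        f"  <sitemap><loc>https://www.example.com/{name}</loc></sitemap>"
    )

# The index file points crawlers at each dated shard.
with open("sitemap-index.xml", "w") as f:
    f.write(
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(index_entries)
        + "\n</sitemapindex>\n"
    )
```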

Takeaways

While using Medium for your blog (or your site) might not be the right choice in every situation, there are many ways this platform can help you grow an audience more quickly or begin a content-heavy site with minimal development startup. The platform is ideal for small startup businesses’ blogs and for publishing sites just starting to grow an audience.

Like most platforms, it’s a risk to put all of your content into a separate company’s hands, and given how much Medium has changed already, this is certainly something to keep in mind when choosing to serve your blog on it. Medium is a great place for long-form, educational and intellectual content, so if your blog or website fits into those genres, Medium is certainly a platform to consider.

Source: Distilled


What We Learned in May 2017: The Digital Marketing Month in a Minute

By Andrew Tweddle

Another month, another glut of digital marketing stories to dive into. The usual suspects are covered, from Facebook and its internal content policing rules to Google adding a number of new features to various offerings. There’s also a great voice-search study from Dr Pete Meyers, plus the best from the folks here at Distilled.

Industry news

Snap misses Q1 earnings and shares nosedive

Snapchat has seen its shares plunge by 20% after missing targets for its first quarterly earnings as a public company, as well as seeing its growth in new users fall to its lowest level in years. This weak growth comes as Facebook pushes Instagram closer and closer to Snapchat’s territory. Snap Inc. made a $2 billion loss in the quarter, but CEO Evan Spiegel attributed this to stock-based compensation.

Read the full story (Business Insider)


An eye-opening look at Facebook’s internal rules on violence, sex and terrorism

In the world of fake news and extremist content, it’s no surprise that Facebook has created extensive internal guidelines for moderators on how to deal with these kinds of issues. The Guardian lifts the lid on the secretive rules and exposes the major challenges and intense pressure that employees face when trying to police and make decisions on all manner of content.

Read the full story (The Guardian)


The lessons from 1,000 voice searches

In an amazingly meticulous piece of research, Moz’s Dr Pete Meyers carried out and analysed 1,000 voice searches on Google Home to try and get a sense of how the intelligent personal assistant was handling your queries. Among the many pieces of information uncovered is the strong correlation between snippets and voice answers, with 71% of queries with snippets also having a voice answer.

Read the full story (Moz)


Chrome will add ‘not secure’ warnings for HTTP sites later this year

It’s the final nail in the HTTP coffin, as Google signals its intent to give ‘not secure’ warnings to any HTTP sites later this year, meaning moving to HTTPS is more important than ever. The warnings will start to appear in October 2017 and will display when users enter data or browse in Incognito mode.

Read the full story (Marketing Land)


Wannacry ransomware largely fails to blackmail users

The recent ‘Wannacry’ ransomware attack in Europe, which affected 200,000 machines and was called unprecedented by Interpol, has largely failed to generate revenue for the attackers from its ransom demands. So far, only an estimated $92,000 has been collected, which is a meagre amount considering the $300 starting demand.

Read the full story (Bloomberg)


Google launches ads hub beta for cross-device tracking

Initially created to analyse YouTube campaigns, the ads measurement system created by Google now includes data from the Google Display Network and DoubleClick. The expanded data available shows a shift from relying on cookies to user and device IDs. The system, called Data Hub, is designed to help give impression-level insights for campaigns served across multiple devices.

Read the full story (Marketing Land)


A year of Google Maps changes

Taking a close look at the finer details of Google Maps, Justin O’Beirne drills down into the cartography and design choices used by the search engine giant. From park walkways to displaying different business listings in multi-storey premises, Justin charts the recent changes and compares them to those of main competitor Apple Maps.

Read the full story (Justin O’Beirne)


Biz Stone returns to Twitter

Twitter, evidently not afraid to turn to old founders (Jack Dorsey returned to become CEO in 2015), has re-recruited Biz Stone to help with company culture. Biz has made it clear that he’s not returning to replace anyone, with Dorsey’s mixed tenure as CEO coming under inevitable scrutiny.

Read the full story (Marketing Land)

Distilled news

SearchLove Boston 2017 has just drawn to a close. We were joined by 200 smart marketers and 16 amazing speakers. The videos for the conference will be available in about four weeks. Meanwhile, tickets are selling fast for SearchLove London 2017, which takes place on 16-17 October.

On the Distilled blog, CEO Will Critchlow has put pen to paper numerous times this month, writing about the end of the 1-Click Amazon patent, the NYT fluffing the Google monopoly argument, and most interestingly, the results of a JavaScript SEO split test.

Over at the Moz Blog, Will continued his JavaScript exploration, digging further into the state of JS indexing. Principal Consultant Ben Estes showed us his smarter methods for measuring and improving site speed, while Sam Nemzer walked through the method of implementing SEO changes using Google Tag Manager.

Source: Distilled


Early Results from Split Testing JavaScript for SEO

By Will Critchlow

We’ve been testing what happens when pages rely on JavaScript to render properly – and one of our first tests showed an uplift when we removed a reliance on JS:

When @distilled ran an SEO split test to remove a reliance on JavaScript, they saw an uplift in search performance https://t.co/JOmPXReGbq pic.twitter.com/7bhzHxV0qK

— Will Critchlow (@willcritchlow) May 25, 2017

As many of you know, at Distilled we believe that it’s increasingly important to be testing your hypotheses about what will affect your search performance. As digital channels mature, and as Google rolls more and more ML into the algorithm, it’s increasingly hard to rely on best practices. To make this easier, we have been rolling out our SEO split testing platform, Distilled ODN (Optimization Delivery Network) to more and more of our clients and customers.

As we get our platform deployed on a wider range of sites with different architectures and technologies, we’re able to start testing more and more of the assumptions and best practices held around the industry.

You can check out a bunch of case studies that we have already published on our site: structured data, internal linking and meta description, title and header tags – and you can find more details in this presentation (particularly slide 73 onwards) that my colleague Dom gave at a recent conference. We also included some teaser information in this post about a big win that added £100k / month in revenue for one customer even while only deployed on the variant pages (half of all pages).

One thing that we were excited to get to test was the impact of JavaScript. Google has been talking about rendering JavaScript and indexing the resulting DOM for some time, and others around the industry have been testing various aspects of it, figuring out when it times out, and finding out the differences between inline, external, and bundled JS.

The hypothesis: there is a downside to relying on JS indexation

I have a Moz post coming next week on how I believe that JavaScript rendering and indexation works at Google, but the very short version is that I think it happens in a separate process / queue to both crawling and regular indexing. I think there is a delay downside, and possibly even more than that. I’ll update this post with a link when my detailed hypothesis post goes live.

We recently had a chance to observe some of the effects of JS in the wild. One of our consulting clients – iCanvas – was relying on JavaScript to display some of the content and links on their category pages (like this one). Most of our customers on the ODN platform are not consulting clients of Distilled, but iCanvas is a consulting client with the ODN deployed (I’ve written before about how the ability to split-test is changing SEO consulting).

With JavaScript disabled, there were a load of products that were not visible, and the links to the individual product pages were missing (it’s worth noting that the pages showed up correctly in fetch and render in Google Search Console). We wanted to make the consulting recommendation that performance may be improved by showing this information without relying on JavaScript – but this is the classic kind of recommendation that is hard to make without solid evidence. There is clearly a cost to making this change, and it’s hard to know how much of a benefit there is.

Before our change, the pages looked like this with JS disabled:

After the change, they looked more like this (which is exactly how they used to look with JS enabled):

[It’s worth noting that although the change we made here was technically a CSS change, the test is measuring the effect of removing JavaScript dependence – we just moved a feature from JS-reliant to non-JS-reliant]

Split-testing the effect of JavaScript

Using our split-testing platform, we rolled out a change to 50% of the category pages to change them so that the information users might be looking for was visible on page load even with JavaScript disabled. The other 50% of pages remained unchanged and continued to rely on JavaScript.
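As an aside, one simple way to get a stable 50/50 split is to bucket pages deterministically by hashing their URLs, so each page stays in the same group for the life of the test. This is just an illustrative sketch, not how ODN works internally:

```python
import hashlib

def assign_bucket(url, experiment="category-page-js-test"):
    """Deterministically assign a URL to 'variant' or 'control'.
    Hashing (experiment + url) keeps each page in one bucket for the
    whole test, while a new experiment name reshuffles the split."""
    digest = hashlib.md5(f"{experiment}:{url}".encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 == 0 else "control"

pages = [f"https://www.example.com/category/{i}" for i in range(1000)]
variant = [p for p in pages if assign_bucket(p) == "variant"]
print(len(variant))  # close to 500: roughly half the pages
```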

We then automatically compared the performance of the updated pages with a forecast of what would have happened if we had not rolled out the change (this is called a “counterfactual”) [more here]. This showed that the pages we had updated to remove the reliance on JavaScript were getting a statistically significant uplift in traffic compared with what we would have expected if the change had no effect:

The platform’s analysis showed a greater than 6% uplift in organic search performance for this set of pages, which amounted to over 3,000 additional sessions per month. This was an amazing win for such a small change (the chart above comes from the dashboard built into our ODN platform).

As an aside, the mathematicians on our team are constantly working on refinements to the way we detect uplifts with statistical confidence (see Google’s paper Inferring causal impact using Bayesian structural time-series models for more background). We use a variety of synthetic tests, null tests and cross-checked data sources to make improvements to the accuracy and sensitivity of the automated analysis. We also apply a variety of treatments to the analytics data to account for various behaviours (dominant pages, sparse traffic distribution, seasonal products etc.), as well as some modifications to how Google’s Causal Impact methodology is employed.
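For readers who want to experiment with this style of counterfactual analysis themselves, the open-source pycausalimpact package (a Python port of the R library from Google’s paper; not what powers ODN) can be pointed at a variant traffic series with the unchanged pages as a covariate. A sketch on synthetic data:

```python
import numpy as np
import pandas as pd
from causalimpact import CausalImpact  # pip install pycausalimpact

np.random.seed(1)
x = 100 + np.random.randn(100).cumsum()  # control pages: no change applied
y = 1.2 * x + np.random.randn(100)       # variant pages track the control...
y[70:] += 8                              # ...until the change lands on day 70

data = pd.DataFrame({"y": y, "x": x})
ci = CausalImpact(data, pre_period=[0, 69], post_period=[70, 99])
print(ci.summary())  # estimated uplift vs. the modelled counterfactual
```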

In the test above we have since improved the accuracy of the analysis (it did even better than the initial analysis suggested!), which is exciting. It also means we are capable of detecting tests that result in smaller uplifts than previously possible, helping lead to improved performance and improved attribution.

It’s possible that even when rendered, JavaScript hinders search performance

The main takeaway is that you should avoid the assumption that JavaScript-driven pages will perform well in search even if Google is able to render them. We need to continue running more JS tests, but in the meantime, we strongly recommend testing whether your reliance on JavaScript is hurting your site’s organic search performance.

Can’t run SEO split tests on your site? Get in touch to get a demo of our ODN platform:

Source: Distilled


Hey E-commerce Managers, Amazon Loses Their 1-Click Patent This Year

By Will Critchlow

To: Digital Marketing Manager, e-commerce clients

I wanted to make sure that you know that Amazon is losing their patent on 1-Click checkout this year. It’s always been amazing to me that they were granted this patent in the first place, but they were (in the US – the EU took the unsurprising view that it was too obvious to patent).

If you aren’t already, I strongly recommend speaking to counsel and the product team to build in a plan to launch a one-click option as early as you are legally allowed.

Background and legal fights

Not only was Amazon awarded the patent on the 1-Click checkout – they also successfully wielded it. Amazon sued Barnes & Noble in 1999 – a suit that was settled in 2002 – and although terms were not disclosed, it ended with B&N removing its single-click system.

If you feel like you’ve seen one-click checkouts in the intervening period, you’d likely be right; Amazon licensed the “technology” to Apple in 2000 for an undisclosed sum. Despite that, it’s been (in my opinion) a drag on innovation, and conversion rates, across the web now for almost two decades. So it’s great news that the patent is expiring this year.

What is one-click checkout?

Targeted at returning customers, the idea of one-click checkout is that you specify your preferred delivery and payment options. In return, any time you are signed in, you get a checkout button on all product pages that places the order immediately with your default options.

This is what it looks like on the Amazon site:

Obviously, this works best for sites that have a high level of both returning visitors who stay signed-in, and a high level of repeat purchases. If your site meets these criteria, there is likely an immediate conversion rate uplift to be had from implementing a one-click checkout that enables instant gratification and faster decision-making.

Over time, it is the kind of technical change that builds loyalty and makes your store more customers’ default for buying anything in your product category.

How much will this be worth to me?

Most teams will need to make a business case for the engineering work to build this into the product – once legal counsel has signed off. I would approach this with a bottom-up model – starting from repeat customers (if you aren’t already tracking repeat customers, here’s how you can do that in GA).

If you first look at how repeat visitors convert to repeat customers, you can build a model that takes a range of assumptions of the conversion rate improvement and spits out a figure for incremental revenue (which you can turn into gross margin with your own proprietary data).

I imagine that for most sites of a reasonable size, the business case will be quite compelling. It’s perfectly imaginable that you could see as much as a 15-20% uplift in conversion rates among returning visitors.
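As a sketch, the bottom-up model can be as simple as a few lines of Python (every input below is an assumption you’d replace with your own GA and finance numbers):

```python
returning_sessions = 120_000  # monthly sessions from returning visitors (GA)
baseline_cr = 0.035           # their current conversion rate (GA)
aov = 68.0                    # average order value
gross_margin = 0.40           # from your own proprietary data

for uplift in (0.05, 0.10, 0.15, 0.20):
    extra_orders = returning_sessions * baseline_cr * uplift
    extra_revenue = extra_orders * aov
    print(f"{uplift:.0%} CR uplift: {extra_orders:,.0f} extra orders/month, "
          f"${extra_revenue:,.0f} revenue, ${extra_revenue * gross_margin:,.0f} margin")
```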

To see what this could be worth to you, you can find your returning visitor conversion rate in Google Analytics like this:

1. Apply the segment “Returning Users”:

2. Select the goal you are interested in to see the conversion rate of the returning visitors (here’s a screenshot from our analytics, showing a low conversion rate because it’s looking only at SearchLove ticket sales vs. all visits to all sections of the Distilled site):

In addition to the direct benefits of improved conversion rate among existing customers who already have their details saved with you, you might also see:

  • Some customers returning to your site more often – because they feel that it’s the easiest site to order from in your niche (this can happen even if they can’t exactly say why they feel that)

  • PR opportunities if you are one of the first in your space to ship a one-click checkout

As always, if you want to discuss any of this, or have any questions, drop me a line and I’d be happy to help.

Source: Distilled


The New York Times Makes Poor Argument for Breaking up Google

By Will Critchlow

At the end of April, The New York Times published an opinion piece that was widely circulated among my network entitled “Is It Time to Break Up Google?”. I saw a lot of people excited at the idea, and the concept of a monopolistic Google obviously resonated with many. Unfortunately, I thought that it was a poor article and that there was not nearly enough critical thinking being applied.

When it was shared on an email list I’m on, I hammered out a rant explaining my issues with the article. This post is an attempt to clean that rant up into a reasonable state for wider discussion. I’m sure there are a bunch of areas where I am also wrong, and I’d love to hear others’ opinions.

I should start by saying that I am generally sympathetic to the idea that Google is dominant in search advertising, and is potentially abusing that power. I think there’s a good chance there needs to be some kind of intervention (and indeed I’ve written a few articles about it, going as far back as 2007: 1, 2, 3).

But I still didn’t like the article, and here’s why:

It bundles tech companies together while making weak cases for them being monopolies

The article starts by highlighting the size of the tech behemoths that have become the biggest companies in the world. Apple, Alphabet (the parent company of Google), Amazon and Facebook have joined Microsoft, which was the only tech company in the top five as recently as 2007.

The largest of them all (Apple) is very clearly not a monopoly. Apple is only the sixth-largest computer manufacturer, and even in the high-end (if that is a distinct market) they are likely below 50% market share (they were around 30% in 2007). Smartphones are Apple’s biggest market – there they are the largest player, but with less than a 20% share overall and less than a 60% share of the high-end market (if that is distinct). You may be able to make a case for them monopolising the tablet market, but only because it’s a generally declining market in which other players have declined faster, and iPads make up only about 7% of Apple’s revenue. None of this is addressed.

Instead, we get weak arguments for Google, Facebook, and Amazon being monopolies. The author says:

Google has an 88 percent market share in search advertising, Facebook (and its subsidiaries Instagram, WhatsApp and Messenger) owns 77 percent of mobile social traffic and Amazon has a 74 percent share in the e-book market. In classic economic terms, all three are monopolies.

Now, I was really pleased to see search advertising identified as the market in which Google is dominant. Too many people talk about “search” share – but while that is interesting in some analyses, it’s not a “market” in the antitrust sense.

The arguments for the other tech behemoths as monopolies miss the mark, though. Monopolies need a market to monopolise, and:

  1. There’s no way that “mobile social traffic” is a market you can monopolise. Mobile advertising would be an interesting market – but Facebook isn’t a monopoly there (Google also has a huge share)

  2. You might get away with claiming that “e-books” is a distinct market (and that Amazon has a monopoly over it) but (a) I’m not sure — I think the relevant market is “books” and (b) if e-books is the thing, it’s a tiny part of Amazon’s business (it’s hard to find concrete numbers but this article estimates ~$2.1bn in the US, so the global number is probably a few % of the total ~$130bn of revenue). Regardless, it’s certainly not the reason it’s a stock market behemoth. Amazon is not a monopoly in any of its interesting markets (retail, even books, AWS, streaming content etc.)

It’s possible that some of this confusion stems from the fact that the author appears to be conflating “monopoly” with “big” – highlighted best by the statement that:

We need look no further than the conduct of the largest banks in the 2008 financial crisis

None of the banks were monopolies.

Poor arguments about innovation and “reallocation” of money

The author claims that:

It is impossible to deny that Facebook, Google and Amazon have stymied innovation on a broad scale

Now. I think it is quite possible that they have stymied innovation, but “impossible to deny” is far too strong – especially when coupled with a complete absence of any kind of argument or evidence. I can certainly see specific areas where each of them may have harmed innovation – but they also each have caused or enabled huge innovation on the other side of the ledger. Just look at the innovation that AWS alone has unleashed on a “broad scale” as one example – and that is just Amazon.

The article continues by saying that:

Billions of dollars have been reallocated from creators of content to owners of monopoly platforms

Leaving aside the euphemistic use of “reallocated”, this is still a weird claim, because I think it’s pretty difficult to pin this on an abuse (or even use) of monopoly power. Newspapers didn’t lose advertising revenue to Google because Google abused their monopoly. Search adverts are simply better than newspaper adverts. It’s almost perfectly backwards – Google got their financial dominance because their adverts were effective – they didn’t use their financial dominance to stop people buying adverts in newspapers.

There are almost no arguments for the headline’s proposal

Of the 21 total paragraphs, the author spends nine arguing for regulation rather than break-up. This is supposedly based on Google and Facebook being natural monopolies – though there is no argument put forward about why this should be the case. It’s not at all clear to me why it would be in the public interest to have only one search engine and one social platform and to regulate them as utilities.

The author suggests three other non-break-up interventions:

  1. Prevent future acquisitions – but this does nothing at all to address the issues the author has been railing against – Google remains dominant in search advertising regardless of whether they can buy Snapchat(!) or not

  2. Regulate as a public utility – this appears to be based on the claim that these big companies are natural monopolies – though there is little argument made as to why that should be the case. The interventions are even weirder – I’m not even going to get into how it’s very much not patents preventing other search engines competing with Google

  3. Remove “safe harbor” protection – probably the weirdest idea of all. For one thing, this appears to be targeted at YouTube, a facet of Google unmentioned up to this point (and he’s surely not arguing that Google is a monopoly provider of internet video / internet video advertising, is he?). Even if you consider it at face value, it’s hard to see how this would do anything to change their position as a monopoly provider of search advertising

Finally, in one brief sentence, we get to a single mention of the headline idea – of breaking up Google. So how should it be done? Which bits should be broken off from the main entity? Apparently they should be forced to sell DoubleClick – the display ad platform. How on earth is that supposed to remedy Google’s dominance in search advertising? This does nothing to undo or prevent any of the harm outlined earlier in the article. So we’re left with a weak argument for doing a thing that won’t help anyway.

Suggested reading

For a clearer view into why big tech companies are in danger of becoming monopolies, and the ways in which US antitrust law (in particular) is poorly-equipped to deal with them, I highly recommend these articles from Ben Thompson of Stratechery (the only paywall I currently pay for – though these are all outside the paywall):

  • Aggregation theory – explaining how the current tech environment results in huge companies with massive power

  • Manifestos and monopolies – a deeper dive into Facebook in particular (and more on the specific downsides of a Facebook monopoly)

  • Antitrust and aggregation – explaining why US antitrust law is poorly-equipped to handle these monopolies because it focuses on direct harm to consumers (which is less clear here) rather than harm to competition (which is much easier to prove). EU law focuses on the latter – so that’s my best bet for some kind of action – that we’ll see it come out of Europe.

Do we need regulation?

I’m also somewhat sympathetic to the idea that tech monopolies get toppled by new tech faster than by governments (it wasn’t the DoJ or the EU that broke the Microsoft OS monopoly, but rather the internet and mobile).

If we do need intervention, I don’t know what a good intervention would look like for Google, but for all the reasons above, I do think this NYT article is full of bad arguments and bad interventions.

I’d love to hear your thoughts in the comments or on Twitter.

Source: Distilled


What We Learned in April 2017: The Digital Marketing Month in a Minute

By Andrew Tweddle

A lot has happened in April in the digital marketing world; so much so that it’s been hard to distill everything down to one blog post. The fake news and extremist content sagas rumble on, while The Guardian shows us that AMP stands to be a major part of publishers’ traffic drivers. There’s also Twitter’s new progressive web app (PWA), which shows the level of functionality you can begin to expect from the new software methodology.

Industry News

Google cracking down on fake news

Danny Sullivan has called fake news and misleading or abusive answers Google’s biggest ever search quality crisis. This month we saw Google launch a coordinated effort to address the problem with a combination of technology, policy, and people changes. Danny has a detailed write-up of the so-called Project Owl – it’s worth a thorough read.

One of the angles of attack is updated guidelines for human quality raters, and in light of that, it’s interesting to read this dive into the life of these people behind the machine.

Read the full story (Google Blog)


The Guardian is getting 60 percent of its Google mobile traffic from AMP

While many publishers have, to varying degrees, scrambled to adopt Google’s Accelerated Mobile Pages project, The Guardian is an example of just how powerful it can be if employed in the right way. As of March, over 60 percent of The Guardian’s Google mobile traffic comes directly from AMP, and the AMP pages in question are 2% more likely to be clicked on than regular mobile pages.

Read the full story (Digiday)


Facebook hiring 3,000 moderators to manage online content

In recent times, Facebook has come under fire for failing to remove extreme content from its site quickly enough, and conversely for removing other content unnecessarily. To help combat this, the social media giant is adding 3,000 new employees into the 4,500 strong community operations team.

Read the full story (The Guardian)


Alphabet reports $24.75 billion revenue in Q1

Google’s parent company Alphabet recently announced record revenue of $24.75 billion in the first quarter of 2017. That’s higher than many analysts predicted, representing a 22% increase year-over-year.

Read the full story (Marketing Land)


Google and Facebook bring in one-fifth of global ad revenue

Google and Facebook have strengthened their hold on the global advertising business by attracting a combined $101 billion in ad revenue, over 20% of the total amount. Google enjoys the lion’s share earning more than three times that of Facebook. Comcast is the largest traditional media owner, with revenue of almost $13 billion.

Read the full story (The Guardian)


50% of page one results on Google are HTTPS

As Google continues to make clear that they want all sites to move from HTTP to HTTPS, the commercial effect of this is starting to become evident. Only nine months ago, just 30% of page one results tested by Dr. Peter Meyers from Moz were HTTPS. That percentage has now risen to over 50%. We assume this is because of sites migrating, rather than a ranking factor change.

Read the full story (Moz)


Twitter launches progressive web app Twitter Lite

To combat the problem of poor internet connections and limited storage capacity, Twitter has launched a new Twitter Lite progressive web app (PWA). Twitter Lite gives users timelines, Tweets, direct messages, notifications and more. Additionally, a data saver mode reduces mobile data usage by up to 70 percent by letting you preview images and videos before loading them in full. Even on good connections, it’s interesting to see a PWA with so much functionality.

Read the full story (Marketing Land)


Google Home can now distinguish up to six different users

Google Home can now recognise the voice of up to six different users on one device, meaning the digital personal assistant can offer customised and personalised results to each different person. The feature is available from today in the US and will soon become available in the UK and rest of Europe.

Read the full story (Search Engine Land)


Google tests Google Hire, its own jobs tool

While it’s still unknown whether it will take on LinkedIn or Greenhouse (or something else entirely), Google has debuted a holding page for its new job site. For now, signing up to hear more is about all you can do, but expect more in the coming weeks and months.

Read the full story (Mashable)

Distilled news

On the Distilled blog

This month, the main focus on the Distilled blog has been the subject of brand awareness. Senior Consultant Tom Capper discussed the ways to measure the often vague idea of brand awareness, and Senior Designer Vicke Cheung looked at the topic from a content creation view, showing you how you can start creating your own content for brand awareness. Also on the blog, Consultant Ore Oduwole outlined the process for approaching the dreaded site migration.


Over at the Moz blog

Speaking of migrations, Principal Consultant Jono Alderson is calling for a better approach to migrations, including improving the very way we define them. Analyst Lydia Gilbertson provides her best tips for small publishers to grow their network with little resource, while Consultant Sergey Stefoglo lays down the basics of faceted navigation. Finally Tom Capper tackles the topic of links. Are they yesterday’s ranking factor, or do they still have a place in an SEO strategy?


Also at Distilled

We’re now just one month away from SearchLove Boston, and the full schedule is now live. You can see it in full here. If you pick up your ticket by 11th May, you’ll still save $200 in the early bird sale. Finally, Distilled CEO Will has been speaking to Expert Market about leading a company and the challenges of thought leadership:

Source: Distilled


Boost Your Brand: How to Create Content for Awareness

By Vicke Cheung

Historically, when digital marketing agencies venture into content creation, it’s with the hope of increasing the number, or strength, of backlinks. This post, however, aims to challenge you to think more broadly about the value of your content.

Why?

We, as digital agencies, are not traditional advertisers. Yet global ad agencies are increasingly edging into our field of play, for example by producing integrated campaigns with far-reaching social effects. There is clearly a demand and need for this type of work. But on top of this, it may well be time to use your content for more than just a link-generator anyway. At SearchLove San Diego this year, Tom Capper presented a balanced argument for why it’s possible that links are becoming increasingly unrepresentative of how Google chooses to rank your pages. One of the key takeaways is that in order to continue getting in front of our audience, we will probably have to start winning at brand awareness and perception. So, how can we do this with creative content?

1. Learn how to differentiate between content for link-building vs. content for brand-building

Typically speaking, with link-driven campaigns it doesn’t matter whether or not the people engaging with a piece of content remember the brand behind it. At the end of the day, it’s down to whether or not publications decide to cover and link to it. And actually, this often means downplaying the brand presence, so as not to deter coverage. For example, this can mean hosting a piece on its own page without company headers and footers, keeping the client logo small and unobtrusive, and generally not having to adhere to their usual brand guidelines. Because of this, it also means that the theme of the content can be quite tangential to the brand. For example, for Magic Freebies, we created an online ‘Spot the Christmas Movies‘ quiz. This piece bears little direct connection with their brand and offering – a site which compiles free samples and free competitions – however, we judged that it was something which would pique the interest of their target audience and publications. I would consider this content for link-building.

Source: Magic Freebies

Pieces for brand-building, on the other hand, require a much stronger connection between the content and the brand in question. However, unlike traditional advertising, this is not necessarily about parading a product or service in front of your audience. For example, with Rasmussen College – a private college in the US – our work was not about directly selling their courses; instead, the strategy was to create pieces, e.g. “The Healthcare Career Matchmaker” shown below, which would help them become an authority in the career and education sector. The content aimed to be engaging and highly relevant, and therefore brand-focussed. If you go and explore the wider Rasmussen site, you’ll even notice that it sits within their usual site framework, and the colours and font follow their brand guidelines too.

Source: Rasmussen

As far as targets go, it’s pretty obvious that the aim of a link-building campaign is getting links! This is a pretty straightforward and easy-to-track metric. With brand-building campaigns, though, ‘getting more brand’ isn’t exactly a thing. So what are the tell-tale targets that distinguish a piece as one that’s created for purposes beyond clocking up a site’s link count? That is precisely what the next section of this post is aimed at uncovering.

2. Think big, but start small

The trouble with content for brand-building is that because it has the word ‘brand’ attached to it, it immediately brings to mind associations such as brand awareness, brand sentiment, or brand trust. These are all things which a campaign could, and should, improve. This is the ‘think big’ part. But without a hefty sum to inject into expensive surveys and polls (which is mainly what advertisers rely on), it’s very tricky to arrive at any meaningful results if all you focus on is ‘increasing brand awareness’ – what even is the proof of success?

And so I would encourage you instead to start small. You can do this in a number of ways. For example, create a campaign which focuses on achieving one of the following:

  • Grow your social following.

  • Build a Facebook retargeting pool.

  • Drive targeted traffic to a specific part of your site.

These are all micro-conversions. They are the small things which happen in the lead up to a full conversion. More importantly though, what’s great about these is that they can all be easily tracked and assessed.

Oliving the Life is the perfect example of this. The campaign itself was created to promote a new range of healthy cooked meats from Hans Smallgoods, an Australian consumer brand.

Source: Oliving the Life

The challenge, as outlined in their case study, was to ‘increase brand awareness’. Despite this, they set their sights on measurable goals. Of the subsequent results that were published, all of them hit a specific and concrete micro-conversion:

  • Reached more than 6 million Australians and was engaged by 2.6 million targeted users.

  • Generated 1.78 million Facebook video views – 198% more than the projected figure.

  • Produced more than 93K engagements (likes, comments, shares).

  • Average time on site was 1min 21sec, with a bounce rate of 34%.

  • Created a retargeting pool of 924,000 engaged users who can be retargeted in the near future.

Another example is the following video campaign by Shutterstock.

Most people who know Shutterstock think of them as a paid stock image library. A lesser-known part of their offering, though, is their extensive range of stock video footage. To address this, they created a series of videos exclusively using assets from this library. Not only was there a high engagement rate with the videos (165,000 views in the first couple of months on YouTube alone), but this then inspired 4,000 clicks specifically onto the footage section of their main page. It was a result that Kashem Miah, the global director of social media and content marketing at Shutterstock, was very pleased with. He was later quoted as saying ‘There is definitely a business case that we have made for these videos and why we’re pushing them so hard…the campaign is an opportunity to showcase what’s possible with Shutterstock.’

The bottom line is that it’s far better to start small and focus your efforts on micro-conversion-based targets, and trust that in doing so, over time you will inevitably influence people’s perception and awareness of your brand. Note that I say ‘influence’ rather than ‘improve’ – this isn’t a given. The last thing you would want to do is to foster negative sentiments towards a brand! To combat this, you have to be sure that whatever campaign you’re attaching your brand’s name to is worthwhile and is something people will genuinely care about.

3. Consistently deliver value to your audience

If someone has been generous enough to spend time with your content, it can’t just be beneficial to you; it has to be valuable to them as well. But how do you judge whether you’re consistently delivering value? This is where the following framework comes in handy.

This triangle represents the types and the corresponding amount of content a brand should be aiming to produce. The link-building creative campaigns we’ve become known for at Distilled tend to fall into the ‘Wow!’ bracket at the top of the triangle (more on this below). However, in order to deliver consistent value, with content that is always relevant and available, we need to consider content beyond these bigger hero pieces.

Starting at the bottom, there is the ‘How?’ category. It basically stands for: how can you help your customers? This type of content makes up the base of the triangle because it should form the bulk of your offering, available 365 days a year. Think of it as laying down the foundations. One of the most obvious ways of offering value to your audience is to pin down the useful things they want to know and deliver on them. Figure out what questions they might ask, related to your brand, and create content which answers them. This can be as simple as a blog post or resource page.

‘Now’ content comes next. This is about offering your audience things whose value lies in their topicality, and rewarding their engagement by keeping them up to date with the latest trends, news or data. Of course, this isn’t just about repurposing the news; it has to be relevant to your vertical in some way. If there’s an interesting news story or emerging trend where your brand can contribute to the conversation, this is when ‘Now’ content can be really powerful. It can demonstrate that your brand is on the pulse and dependable. Because of its nature, there will naturally be less of this type of content than ‘How?’, since you’ll be publishing it as and when something is topical.

Last but not least, there is ‘Wow!’. As mentioned above, these are the bigger, hero-type pieces, and the name makes the category fairly self-explanatory. The value this type of content offers is that it entertains and wows your audience. It is typically characterised by being more visual, e.g. in the form of styled infographics and quizzes. It’s positioned right at the peak of the triangle because these pieces can form as little as 10% of your overall content offering. Our benchmark at Distilled is to schedule no more than six of these pieces a year for our clients. Because of this, you have more time to experiment with new formats and create truly sensational things.

Using the above framework, the job site Monster was able to grow pageviews on content by 18 million and increase the number of people searching for jobs through them by 28% in just over a year. You can read more about their success story here.

To conclude

Once you’ve mastered the above, measuring the success of your campaigns, based on whether or not the brand-awareness needle has actually moved, is a whole art in itself. But it is one you should get to grips with in order to prove the worth of your content and make a business case for continuing to do more. To learn more about this, Tom Capper’s summary of how to get started is worth a read.

If you’d like to find out more about the points discussed in this post, my slides from SearchLove San Diego – Success Beyond Links: How to Make Your Content More Valuable – go into more detail. The full video of the presentation is also available to DistilledU members. If you have any further questions or comments, I’d love to hear from you below.

Source: Distilled

    

The Hitchhiker’s Guide to Site Migration

By Ore Oduwole

“Towel day” by Alan O’Rourke

In online marketing, site migration is a phrase that usually makes SEOs, PPCers, site owners and stakeholders wince. We’ve all heard the horror stories about sites that have migrated from one domain to another and experienced a huge drop in traffic and visibility, and those that have suffered the same fate just by changing protocol. Whether you have acquired a domain, want to roll your M-Dot site up into a responsive design or want to move from HTTP to HTTPS, devising a solid action plan to avoid traffic and revenue loss is paramount. On this journey, we’ll cover some of the most important things to address before, during and after migration to give your site the best chance of a smooth transition to your new destination.

Pre-pre-migration: migration types and considerations

So you are staring out into the abyss thinking: what now? Before you rush off into the unknown, let’s start with the basics: migration types. There are many reasons you may want to migrate your site, but the most common include:

M-Dot Roll-Up migration

This type of migration involves transitioning from separate mobile and desktop sites to a fully responsive website. Moving to a responsive site helps to consolidate site authority and reduce development resources (due to having only one site to update).

HTTP to HTTPS migration

An HTTP to HTTPS site migration is one that is becoming more common. This type of migration is one where the domain remains unchanged but an SSL certificate becomes associated with the site. This certificate is a symbol of a safe and trustworthy site, which explains Google’s push for domains to adopt the protocol. Google have recently added to this by displaying the word ‘Secure’ in the address bar on all HTTPS pages, and have hinted that the phrase ‘Not secure’ may feature on HTTP pages in the near future.

ccTLD to TLD migration

This type of migration involves moving from a country-specific ccTLD (country code Top Level Domain) to a more internationally recognised TLD (a well-documented example of this was the Guardian moving from .co.uk to .com). A move from ccTLD to TLD can be great for brands with a much wider audience outside their country-specific location, due to the reduced resources needed to manage each branch separately.

Rebranding/ consolidating multiple domains migration

Rebranding is a type of site migration which occurs due to a change of name or a brand acquisition. Like ccTLD to TLD migrations, this can involve moving a single domain or migrating multiple domains into one. As expected, involving multiple sites in a migration brings greater risk to traffic and visibility.

No matter which type of migration you are looking to do, each comes with a shared list of dos and don’ts:

Do understand that looks aren’t everything

With the prospect of a new site comes the excitement of building something visually stunning. Do put your flair on the new site, but make sure this doesn’t come at a usability or SEO cost.

Do consider wider channels

Site migration has a big impact on multiple digital channels, and this is sometimes overlooked:

  • Social – This involves thinking about social platform bios, logos, names, trademarks and brand tone of voice. It should be relatively simple to update a social media account without interrupting ‘regular’ behaviour, but if you need users to take action, give them notice and follow up with regular ‘countdown to launch’ reminders.
  • PPC – If you run AdWords paid search activity then this is a biggie. Make sure you update your final URLs to reflect the URL changes you have made in the migration plan – the last thing you want is your ads sending users to broken links or being disapproved entirely. Also, don’t forget to adopt the same tracking codes/UTM parameters to ensure that there is no break in reporting.
  • Offline – If your migration involves a name change, you may want to take out advertising on billboards or in the local press (and beyond, depending on your offering). Also, don’t forget to look back at campaigns/products on the previous domain that may have been supported by vanity URLs – be sure to 301 redirect these to your new site if they are still valid or have garnered links and social shares.
  • SEO – Organic traffic is likely to be impacted most adversely in the short run as Google makes sense of the redirects and page changes that have been made. As a backup, it is useful to allow extra budget to invest in alternative traffic sources in this period, such as email or PPC spend for keywords that you rank well for organically.

Do get the timing right

Timing is everything. In the planning stage it is vital to choose the best time to migrate by considering the following questions:

  • Is your site affected by seasonality? If so, when are the peak periods?

  • Who are the core team members that will be involved in the site migration? What is their availability?

Take advantage of your analytics data to understand your business traffic patterns, and Google Trends data to understand overall user demand within the market. Plan to migrate during a quiet business period where ample staff resources are available.

Do set the tone

As hard as you try, migrations don’t always go to plan, so it is recommended to manage expectations early:

  • Agree on migration objectives (why are you migrating?)

  • Be clear about the amount of time and effort a migration takes and the tight deadlines needed to keep the project on track

  • Outline what is expected from everyone involved in the migration and the impact of these expectations not being met – make sure that a migration plan is developed and communicated across all teams involved early on

  • Be clear about the potential impact of migration in both the short and long term and the average length of time needed for ranking recovery (this will be longer if changes to site design, URL structure etc. are compounded)

  • Share migration case studies with clients if you have them

Do agree on reporting format and frequency

Agreeing on what will be monitored and reported provides accountability and makes sure that you are aware of what is important to measure, and what data is important to pull before launch. From experience, clients often request weekly ranking reports with keywords divided by category type (determined by the site), and page speed insights within the first few weeks of launch.

Although this list of dos and don’ts is important, it isn’t exhaustive. It is strongly recommended that a thorough technical audit is performed before migrating. This allows you to identify any issues that should be resolved to prevent trouble down the line. If you are aware that your site is in particularly poor technical standing or suffering from a penalty, it is advised to delay your migration plans until these issues are resolved.

Pre-migration: fuelling the rockets

Now we’ve thought about what’s in the abyss, it’s time to arm yourselves with information before you get out there. It is vital to obtain as much URL information about the legacy site as possible, for tracking, benchmarking and URL mapping purposes. This can be gained by exporting data from the following sources:

  • Site’s analytics platform – Export a list of every page that has received at least one visitor in the last 12 months. This ensures that all traffic-driving pages are accounted for, ready for the URL mapping process.

  • Buzzsumo – Export a list of all your most shared content. This is a great way to ensure that content users have engaged with, and continue to engage with, is accounted for.

  • Screaming Frog / DeepCrawl – Run and export a full crawl of the legacy site to gather a list of every URL that may need to be mapped. (If you have a separate M-Dot site or subdomains that you are looking to move, don’t forget to include these in the crawl).

  • Moz’s Open Site Explorer / Majestic / Ahrefs / Google Search Console – From these tools, export a list of the legacy URLs that have external links pointing to them. By using each tool you can ensure that you are casting the data-capture net as wide as possible, given that each tool collects backlink data differently.

  • AdWords – Export a full list of URLs you are using for your PPC campaigns. If you have PPC specific URLs, ignoring these could lead to broken links, a significant drop in quality score and even mass ad disapproval.

Once you have exported this data, it is time to combine the lists, remove duplicate URLs and prioritise the most important URLs for redirection. This can be done using programs such as Google Sheets or Numbers (for Mac), but for the speed of processing large volumes of data and the ability to easily group, de-dupe and order, Excel is the preferred choice. Next, create a list of URLs for the new site. When you have a list of unique legacy site URLs ordered by importance and a list of planned URLs for the new site, it’s time to create your URL redirect map.
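
If you’d rather script this step than wrestle with spreadsheets, a few lines of pandas will combine and de-dupe the exports for you. This is only a minimal sketch, assuming each export has been saved as a CSV with a ‘url’ column; the filenames and the ‘sessions’ column are placeholders for whatever your exports actually contain:

    import pandas as pd

    # One CSV per export: analytics, crawl, backlinks, PPC (hypothetical filenames)
    exports = ["analytics.csv", "crawl.csv", "backlinks.csv", "ppc.csv"]

    # Combine all exports into one frame, noting where each URL came from
    frames = [pd.read_csv(f).assign(source=f) for f in exports]
    urls = pd.concat(frames, ignore_index=True)

    # Normalise, then de-dupe so a URL found in several exports is kept once
    urls["url"] = urls["url"].str.strip().str.lower()
    urls = urls.drop_duplicates(subset="url")

    # Prioritise by traffic, using the sessions column from the analytics export
    urls = urls.sort_values("sessions", ascending=False, na_position="last")
    urls.to_csv("legacy_urls_prioritised.csv", index=False)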

Map each legacy URL to the new site URL on a 1:1 basis (where possible) rather than blanket-mapping to the homepage or a category page, and ensure that this is done via a 301 redirect, given that you want to let search engines know that the redirect is permanent. With some migrations, there is an enormous number of URLs that need to be mapped. If this is the case, look out for opportunities to use formulas and regular expressions to make the task lighter.
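
Where the new URL structure is patterned, a small script can generate much of the map for you. A rough sketch, with invented patterns purely for illustration; anything that doesn’t match a rule gets flagged for manual 1:1 mapping:

    import re

    # Invented examples: /products/123-blue-widget -> /shop/blue-widget
    rules = [
        (re.compile(r"^/products/\d+-(?P<slug>[\w-]+)$"), r"/shop/\g<slug>"),
        (re.compile(r"^/blog/\d{4}/(?P<slug>[\w-]+)$"), r"/articles/\g<slug>"),
    ]

    def map_url(legacy_path):
        """Return the new path for a legacy path, or None if no rule matches."""
        for pattern, replacement in rules:
            if pattern.match(legacy_path):
                return pattern.sub(replacement, legacy_path)
        return None  # no match: flag this URL for manual mapping

    print(map_url("/products/123-blue-widget"))  # -> /shop/blue-widget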

Once you have created your URL map for the new site, it’s time to benchmark your legacy site. This will make it easy to measure current performance against your new site. Make a record of the following on your legacy site:

  • Site speed of the top traffic/revenue-driving pages using tools like Pingdom, GTMetrix or Google PageSpeed Insights.

  • Rankings for your keywords (this does not need to be exhaustive, although it should contain your most valuable keywords and be spread across the products/services you offer). In order to effectively monitor keyword behaviour and patterns after migration, be sure to group similar keywords together in ‘category’ type groups.

  • Organic traffic and conversions per page.

Now that you have your most important data and your new domain confirmed:

  • Create a robots.txt file to dictate which areas of your new site search engine spiders can access. Areas that you don’t want crawlers to reach should be marked with ‘Disallow: /folder-on-site/’. An example of this can be found in Google’s robots.txt file, and a minimal example follows this list.

  • Create an XML sitemap for the new site.

  • Register and set up the new domain in Google Search Console.

  • Create a useful 404 page to help users that reach a broken or non-existent page find their desired destination on site.
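
For reference, a minimal robots.txt along these lines might look something like the below. The disallowed paths are placeholders rather than recommendations, and the Sitemap line simply points crawlers at the XML sitemap created above:

    User-agent: *
    Disallow: /staging/
    Disallow: /checkout/

    Sitemap: https://www.example.com/sitemap.xml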

OK, so now you are ready to make the journey, right? Not quite yet – the abyss can be big and scary, so I would recommend performing a test run.

If you aren’t using a staging environment to test site changes, it is highly recommended that you start now. A staging site is a great way to mess with settings pre-launch to understand the full effect of the changes made. Just make sure that it is blocked in robots.txt and/or that all test site pages have a noindex tag on them. Once this is done, use the staging site to:

  • Test every 301 redirect from the legacy to the new domain (a scripted version of this check follows the list).

  • Check that URLs present the expected information (e.g. meta descriptions, H1 tags, title tags).

  • Check that internal links return 200 status codes and that there are no broken links present.
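
Checking hundreds of redirects by hand gets old quickly. Here is a sketch of how the redirect check might be automated with the requests library, assuming your URL map is a CSV with ‘legacy_url’ and ‘new_url’ columns (both column names are assumptions):

    import csv
    import requests

    with open("url_map.csv", newline="") as f:
        for row in csv.DictReader(f):
            resp = requests.get(row["legacy_url"], allow_redirects=True, timeout=10)
            hops = resp.history  # the redirect chain that requests followed
            ok = (
                len(hops) == 1                      # exactly one hop, no chains
                and hops[0].status_code == 301      # permanent, not a 302
                and resp.status_code == 200         # destination resolves
                and resp.url.rstrip("/") == row["new_url"].rstrip("/")
            )
            if not ok:
                print("CHECK:", row["legacy_url"], "->", resp.url, resp.status_code)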

The migration: launch

Finally, you’ve finished your rigorous testing, you’ve set up your monitoring tools and everyone and everything is in place for the big button push – launch that site!

  • Launch! – Publish content to the new domain and ensure that there are no broken internal links and that pages are displaying as expected. Apply the 301 redirects from the legacy domain to the new domain.

  • Crawl Legacy URLs – Using Screaming Frog, upload your legacy URLs in list mode and crawl to ensure that all pages are 301 redirecting. If this isn’t the case, review any non-200 status code pages manually.

  • Update robots.txt file – Remove the disallow rules in robots.txt/remove the noindex tag from pages where applicable, in order to open up the relevant pages for indexation. Remove password authentication if extra precautions were taken.

  • Tracking code – Check that all tracking code on the site (analytics, retargeting, AdWords, Google Search Console etc.) is firing and collecting data as expected.

  • Notify Google of site change – If the only change occurring is the protocol (from HTTP to HTTPS), or a subdomain name change, this step will not need to be taken. In all other instances, it is imperative to notify Google via the existing Google Search Console domain as soon as the migration is launched.

  • Fetch as Google – Make sure that your homepage and any other important pages are accessible to Googlebot and display content as expected. You can ‘fetch’ through the following path: Google Search Console > Fetch as Google > Enter URL > Fetch and Render > Submit to Index. Although other search engines will pick up changes from Googlebot, it is advised to follow the same process with the designated webmaster tools for each search engine that contributes a significant amount of site traffic.

  • Real-time – Using Google Analytics (which you should be, even if you use another analytics platform), monitor the real-time feature to view the drop in users to the legacy site and the rise in users to the new site.

  • Review and Upload Sitemap – Check that the new site XML sitemap is as expected and that URLs return a 200 response code when run through Screaming Frog in list mode (if errors occur, address each URL individually; a scripted version of this check follows). Once this is done, upload the legacy and new XML sitemaps via Google Search Console > Crawl > Sitemaps > Add. Uploading both the new and legacy sitemaps will help crawlers identify the new desired pages and understand that legacy URLs have been redirected. As above, this should be done in all webmaster consoles that contribute a significant amount of traffic to the site.
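
If you’d like to script the status-code part of the sitemap check, a minimal sketch follows. It assumes a standard single-file XML sitemap at a placeholder URL; swap HEAD for GET if your server rejects HEAD requests:

    import requests
    import xml.etree.ElementTree as ET

    SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder URL
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            print(status, url)  # anything non-200 in a sitemap needs investigating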

Post migration: fighting the baddies and taking it home

You’ve thought about the journey, fuelled your rockets and now you are in flight. You are landing soon and you want to make sure you glide rather than fall out of the sky. Depending on the strength of your site, backlink profile and social clout, Google will begin crawling your site quite quickly; however, new pages will enter the index gradually over time. Regularly check search engine caches for important pages, such as the homepage and top-level category pages, to identify when the new URLs/page content are indexed.

Google Search Console checks

In the days after migration, Google Search Console makes it easy to monitor a site migration, including messages and crawl error reports:

  • Alerts and messages – Check the Google Search Console inbox daily for any alerts or error messages that need to be addressed.

  • Indexation – Compare the number of submitted URLs to the number of indexed URLs according to Google Search Console. These numbers may not be close together in the first few days, but if they still diverge after that period, there may be errors that need to be addressed.

  • Crawl errors – Be sure to check the crawl error report daily for the legacy site and the new site. Within this report, it is important to pay attention to the date the error appeared and compare this to the date any changes were made. If you believe that the errors in the report have already been identified and resolved, mark all errors as fixed. If they are still an issue, the error will return and it will be clear what needs to be addressed.

Screaming Frog crawl

Beyond Google Search Console, Screaming Frog is a great tool to monitor status codes, redirect chains, tracking code and more. Using the tool, perform a crawl of the legacy site URLs to ensure that:

  • There are no temporary 302 redirects, or redirect chains present

  • No real pages return a 404 status code

  • Tags and meta descriptions have been migrated as expected

  • Analytics tracking code is present on all pages (use the custom extraction feature to identify this, or see the scripted check after this list)

  • No pages that you want to be indexed are being blocked by robots.txt or meta robots tags
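
The tracking-code check in particular is easy to script if you don’t have the custom extraction feature to hand. A minimal sketch that looks for your analytics ID in each page’s HTML; the ID and URL list are placeholders:

    import requests

    TRACKING_ID = "UA-XXXXXXX-1"  # placeholder for your analytics property ID
    pages = ["https://www.example.com/", "https://www.example.com/about/"]

    for url in pages:
        html = requests.get(url, timeout=10).text
        if TRACKING_ID not in html:
            print("Missing tracking code:", url)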

Update online properties

Make sure to update social media properties to reflect the site migration, even if redirects are already in place. It may also be beneficial to update Twitter handles and brand pages. Both Search Engine Guide and Moz provide helpful social rebrand guides for all the major social platforms.

Update your site’s most valuable inbound links

Where possible, it is strongly recommended to contact the owners of sites that link to yours where the URL has changed. Although a redirect will already be in place, a linking root domain updating their link directly to the new URL will remove undesirable redirect chains and ensure that the maximum amount of link equity is passed to the new page. More often than not, the sites will appreciate the update. Using the data pulled during the pre-pre-migration stage from Majestic, Ahrefs, Google Search Console and Moz’s Open Site Explorer, identify your most valuable inbound links and reach out.

Build new links to your site

It is important to build new links in order to replace some of the link equity lost through 301 redirects, and to create new paths for search engines to discover and crawl your site. As always, this is best done by creating content that is informative, relevant and useful. Evaluating your existing content by what performs well in terms of visits and engagement, and grouping it using a content matrix, can help determine your next move.

Tracking and benchmarking

Once the new site has launched, it is time to monitor and report on the impact of your changes:

  • Compare site speed and usability of the legacy site vs. the new site for the legacy site’s most valuable pages based on the benchmarking data collected earlier.

  • Using your chosen ranking tool, monitor your pre- vs. post-migration performance on a weekly basis. As tempting as it is, try not to draw any conclusions on positions for at least four weeks. It can take a while for Google to completely understand the migration that has taken place, and this is compounded by the size of the site, among other factors. Eventually, rankings should recover to around the same positions they held previously. As it is quite common for rankings to drop before recovering, it is stressed early in this post that you should be transparent with clients so that there are no nasty surprises.

Final thoughts

Just to wrap up for those of you who looked at the post and thought TL;DR (too long, didn’t read): a site migration is a significant project which affects multiple digital channels and should, therefore, be performed with great planning and care. For the greatest chance of success, be sure to follow the processes in this migration checklist so you aren’t spending a large chunk of the post-migration period chasing your own tail.

Remember to ask questions early, pull all necessary data with plenty of time, test and retest your 301 redirects before launch and consider the impact of site migration on wider channels. Migrating a site takes a lot of effort, but if done properly, the rewards can be plentiful.

Useful tools

Source: Distilled

    

Getting Started with Measuring Brand Awareness

By Tom Capper

Few people dispute that brand awareness is an important consideration for companies of all sizes – there’s a half-trillion dollar global advertising industry built largely on that premise, after all. In the SEO industry, however, we’re probably not as aware of it as we should be. I recently published a study on Moz showing that branded search volume is better correlated with organic search ranking in Google than Domain Authority, and as Google gets smarter and links become increasingly unrepresentative of how the web works, we can only expect this relationship to deepen.

As the digital marketing and offline marketing industries converge, we as digital marketers have a lot to offer in terms of measurement and understanding that hasn’t historically been available to brand marketing. Similarly, there are pre-internet methodologies that can fill in some of the gaps in measurement that we as digital marketers tend to give up on too easily.

In this post, I want to combine the best of both and outline how you can get started with measuring your brand awareness, and thus hopefully the impact of your campaigns. How do you quantify brand awareness? There are four main ways of posing this question, which I’ll answer in turn:

  1. How many people are looking for you?

  2. How many people want to hear from you?

  3. How many people would recommend you?

  4. How many people have heard of you?

The fourth is probably closest to what most of us have in mind when we think about brand awareness, but all of these questions are important, particularly for digital. Platforms like Google and Facebook can easily answer these questions about your brand with their proprietary data, and they’ll definitely be looking to use it to decide which companies to put in front of their users.

How many people are looking for you?

Branded search & Google Trends

These two are a great place to get started, and even the smallest brand should be able to measure movement in branded search impressions as reported by Google Search Console.

To add a bit of nuance, you’ll obviously want to take into account seasonality and day-by-day fluctuation, but you should also look out for specific branded combinations. For example:

  • Distilled events

  • Distilled blog

  • Distilled SEO

Searches for these branded terms can be a great indicator that specific parts of a marketing strategy are having their intended effect.
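
One lightweight way to track these combinations over time is to export your query data from Google Search Console and filter for the branded variants. A sketch, assuming the export is a CSV with ‘query’ and ‘impressions’ columns (names will vary by export):

    import pandas as pd

    queries = pd.read_csv("search_console_queries.csv")  # assumed export filename

    brand = "distilled"
    modifiers = ["events", "blog", "seo"]

    # Keep only branded queries, then total impressions per modifier
    branded = queries[queries["query"].str.contains(brand, case=False, na=False)]
    for m in modifiers:
        subset = branded[branded["query"].str.contains(m, case=False)]
        print(m, subset["impressions"].sum())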

Direct traffic to homepage

This is a much muddier source, particularly because so much “dark traffic” is bundled in – most notably from untagged links in apps and emails, and from organic search. However, people typing your domain name into their address bar is a great signal in just the same way that branded search is – it shows that they are looking for you, and you in particular, and direct traffic is the best way to see how frequently this happens.

Vanity URLs

Vanity URLs are fairly commonly used in offline advertising campaigns – typically an easily memorable URL that redirects to a URL with tracking parameters. The problem with these is that lots of people who see the given campaign will still use other methods to navigate to your site if they’re prompted by the campaign to do so, and the effectiveness of the campaign may play out in other ways (for example, people may recognise you in the search results a year later). However, what vanity URLs are good for is demonstrating the minimum number of visitors who are definitely attributable to the campaign.
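
The redirect itself is trivial to implement. As one hedged illustration, a tiny Flask app could do it; the path and campaign tags here are invented:

    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/tv")
    def vanity_tv():
        # A 302 here keeps browsers from caching the campaign tag forever
        return redirect(
            "https://www.example.com/?utm_source=tv&utm_medium=offline"
            "&utm_campaign=spring_brand",
            code=302,
        )

    if __name__ == "__main__":
        app.run()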

How many people want to hear from you?

Social following

Social follows are one of the strongest signals that people not only know about your brand, but actually like it enough to be seen publicly as liking it, and enough to read your social content. Even if your social content is fairly light and non-promotional, this can only be a good thing, and again it’s super easy to contrast your normal growth rate of followers with your growth rate after a campaign.

Email subscription

This is a similar logic to social follows, although arguably a weaker endorsement. Even though most brands value email addresses above followers, because of the higher directly attributable conversion numbers, an email subscription could just mean that an individual is on the lookout for bargains. This will depend on what call to action you used when you got them to sign up.

How many people would recommend you?

Backlinks

Given my earlier remark about Domain Authority vs. branded search volume, this is pretty ironic, isn’t it? But this is precisely what makes SEO so hard – all of these signals are related to each other, as well as with rankings. Linking to you (organically!) is a great endorsement of your brand, and something that the SEO space has gotten remarkably good at measuring.

The context, target and anchor text of a link can change what this means, though. Which brings me onto:

Referral traffic

The best way to tell whether a link is an effective endorsement is people clicking on it to visit your site. Referral traffic can often be fairly muddy, with various forms of search engine traffic, Google traffic (particularly AMP) and internal traffic mixed in. However, if you’re willing to take the time to create and maintain a segment or filter, you can get through to the parts that are more valuable for measuring brand awareness and visibility.

Word of mouth referrals

You’ll never know exactly how often this happens, but there are a few ways to get a feel for the trend. Most practically, you can ask people in the conversion funnel how they heard about you, although test to make sure you’re doing this in a way that doesn’t impact conversion rate – as a general rule, adding extra complexity or steps into a conversion process isn’t a great idea, so you need to proceed with caution.

How many people have heard of you?

Survey data

TV advertising and large-scale PR will often be measured through polling. There are some obvious limitations – namely the high cost, and the fact that you have to be doing something pretty big, or for a long time, for it to be measurable by this method. That said, services like Google Consumer Surveys have made this sort of approach dramatically more accessible, with prices as low as $0.10 per respondent.

If you think you’re at a point where enough people have heard of you for your reputation to be measurable in randomly sampled population surveys, there are a few different ways of approaching it. Choosing the most suitable option will come down to what exactly you’re looking to measure. For example, contrast the following:

  1. What services does Distilled offer?

  2. Name 3 digital marketing companies.

  3. Do you recognise any of these brands? (e.g. Distilled, Seer, iPullRank)

The second is the closest to being brand awareness measurement, but you might find it easier to detect movement in the first or third.

Despite their limitations, surveys are the best answer to this question, because the other methods available only measure a proxy for awareness, not awareness itself.

Impressions

These can take many forms – it might be the number of people who’ve seen a web page, or the views on a YouTube video. It’s probably the clumsiest method in this article, in that you can tell how many people saw something with your brand on it, but not whether they were paying attention, or noticed your brand. Still, if you’re comparing your campaign with other media, and most notably TV, this is sometimes the only comparison available.

Share of voice & mentions

A number of tools will track and calculate estimates of share of voice or mentions. This can range from simple Google Alerts up to sophisticated and expensive social network tools such as Ubermetrics. The problem is that, like impressions, you’re measuring the opportunities for people to see your brand, not their actual awareness of it. But again, it’s a proxy which you can use to tell whether you’re going in the right direction.

Social interactions

This is by far the simplest way to get an estimate of how many people noticed a campaign or product. It has some of the same problems as impressions or mentions, but at least requires the user to be a little more involved in what they were looking at.

Separating impact from noise

In all of the above, context is vital. So what if 20,000 people searched for your brand on Google this month? Is that more than would have done without the campaign you ran? Is that a lot?

In order to accurately benchmark, you need something to benchmark against, and realistically this is either yourself or a competitor.

Some metrics lend themselves well to the latter – your competitors’ branded search volume or social follower counts are as easy to track as your own.

Other metrics don’t lend themselves so well to that, and comparing to competitors doesn’t help you attribute success to specific events. For that, you need to either use a metric that can be specific to a campaign (such as impressions or social shares), or get into a bit of statistics and compare what actually happened with what would have happened without your intervention.

Fortunately, this is more accessible than it sounds. I published a free tool here on Distilled back in 2015, called Distilled Forecaster, which is designed to help you produce forecasts for any given data series, and these forecasts can act as a basis for a counterfactual. You can get started here.
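
The idea behind a counterfactual is simple enough to sketch by hand: fit a baseline to the pre-campaign period, project it forward, and compare what actually happened against the projection. A deliberately naive version using a linear trend in numpy follows; the figures are invented, and Forecaster itself is more sophisticated than this:

    import numpy as np

    # Weekly branded search volumes: 12 weeks pre-campaign, 6 post (invented data)
    pre = np.array([980, 1010, 995, 1030, 1050, 1020,
                    1065, 1080, 1075, 1100, 1090, 1120])
    post = np.array([1350, 1400, 1380, 1420, 1390, 1440])

    # Fit a linear trend to the pre-campaign weeks and project it forward
    slope, intercept = np.polyfit(np.arange(len(pre)), pre, 1)
    weeks_post = np.arange(len(pre), len(pre) + len(post))
    expected = slope * weeks_post + intercept

    # The gap between actual and projected is the estimated campaign effect
    uplift = post - expected
    print(f"Estimated extra branded searches: {uplift.sum():.0f}")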

Discussion

How does your business measure brand awareness? Let us know in the comments below!

Source: Distilled