
Scaling Geo-Targeted Local Landing Pages That Really Rank and Convert – Whiteboard Friday

Posted by randfish

One question we see regularly come up is what to do if you're targeting particular locations/regions with your site content, and you want to rank for local searches, but you don't actually have a physical presence in those locations. The right track can depend on a few circumstances, and in today's Whiteboard Friday, Rand helps you figure out which one is best for your organization.

For reference, here's a still of this week's whiteboard!

Video Transcription

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week we're talking about geo-targeted or geo-specific local landing pages for companies that are trying to reach many geographic regions and need to have that scale, but don't necessarily have a local physical location in every city they're trying to target.

So if you can imagine I'm going to use the fictitious Rand's Whisky Company, and Rand's Whisky Company is going to be called Specialty Whisky. We're going to be running events all over the country in all sorts of cities. We're going to be trying to reach people with a really local approach to whisky, because I'm very passionate about whisky, and I want everyone to be able to try scotches and bourbons and American whiskies as well.

Okay, this sounds great, but there's going to be a big challenge. Rand's Whisky Company has no physical location in any city other than our main Seattle headquarters. This is a big challenge, and I've talked to many startups and many companies who have this same problem. Essentially they need to rank for a core set of terms in many different geographies.

So they might say, "Hey, we want to be in Nashville, Tennessee, and in Atlanta, Georgia, and we've identified a lot of whisky consumers in, let's say, Baton Rouge, Louisiana. But we can't open physical, local office space in every one of those geographies. In fact, we could probably only start by having a Web presence in each of those. We haven't necessarily achieved that sort of scale and service in every single one of those geos yet." It's not like I'm running events in every one of these the day I start. I might start with Seattle and Portland and maybe Boise, Idaho, or Spokane or something like that, and then eventually I'll grow out.

This presents a big challenge in search results, because the way Google's results work is that they bias toward showing two kinds of things in a lot of categories. They'll try to show you the local purveyors of whatever it is the person is searching for in the local or maps results. Obviously, Pigeon brought a big change and update to these, changing the geographic areas, the ordering of those results, and how many map results show up, those sorts of things. But obviously they're still very present.

We see them a lot. MozCast sees a very high percentage of local-intent queries, sometimes even without the city modifier. If you're in a geography and you search for a whisky store, you know what? Liquor stores and specialty liquor companies and that kind of stuff, they're going to show up in your search results here in Seattle in those Maps local boxes. So that makes it tough.

Then the other category is, of course, the organic web results. That's where folks like this, Rand's Whisky Company and other folks who are trying to scale their local presence, need to show up because you really won't have an opportunity in those local results unless and until you have true local physical space. So you're aiming for those Web results.

You're oftentimes competing with people like Yelp and Angie's List. A lot of the old Yellow Pages folks are in there with their directories and guides. Then sometimes, occasionally, there will be a company that does a great job with this.

So there are two companies that I want to call out. One is Uber, which everyone is pretty much familiar with, and Uber has done a great job of having their website contain unique portals for each city in which they operate, unique social accounts, unique blogs. They really have put together a segmented operation that targets each city that they're in. They do have physical space, so they're cheating a little bit on this front.

Then another one is a company called Ride the Ducks, and Ride the Ducks has different websites for every city that they operate in. So there's a duck tour in Boston, a duck tour in Seattle, a duck tour in Los Angeles, all this kind of stuff. You can ride the ducks in any of these cities.

Now let's say that you're a startup or a company starting out, and you're thinking, "Okay, fine. I'm going to have my Specialty Whisky page for Seattle, and I'll just put some generic information in there, and then I'll replace Seattle with Portland, with Los Angeles, with Baton Rouge. That's my Baton Rouge page. That's my Los Angeles page." This is what I call the find-and-replace approach.

Even if you push this out, even if you customize some of the content on this page, try to make it a little more specific, have a few addresses or locations, you will fail. Unfortunately, Angie's List, who I mentioned, does a really terrible job of this. They have a lot of pages that are what I call find-and-replace pages. You could just plug in nearly any city, and that's what the results would look like. They do rank, but they are ranking because they were early and because they've got a lot of domain authority. Do not think that you can copy their content strategy and succeed.

The next one is a little bit more scaled out. This is a little bit more like what someone such as a Yelp or TripAdvisor might do for some of their landing pages. They've got some unique info in each city. It's the same for each city, but it's scaled out and it's relatively comprehensive. So, my Specialty Whisky Seattle page might show our favorite bars in Seattle. It might show some recommended stores where you can buy whisky. It might show some purveyors, some vendors, that we like. It could have some local events listed on the page. Fine, great. That could be good enough if the intent is always the same.

So if the intent is the same in every city, if the people searching for restaurants in Portland versus restaurants in Seattle are basically looking for the same thing, then it's the same kind of people looking for the same kind of thing, and that's how Yelp and TripAdvisor and folks like that have scaled this model out to success.

If you want to take it even one step further, my final recommendation is to go in the direction of what Uber and Ride the Ducks and those types do, which is essentially to have a customized experience created by a local team in that city, even if they don't necessarily have a physical office. Uber, before they open a physical office, will send people out. They'll go gather a team on the ground. Yelp did this, too, in their history as they were scaling out.

That kind of thing is like, "Hey, we've got some photos from some of our events. We've got a representative in the city." This is Seattle Whiskey Pete, and Whiskey Pete says, "Yar, you should buy some whiskey." It's got a list of events. So Knee High is stocking up for the holiday (presumably at the Knee High Stocking Company, which is a great little speakeasy here in Seattle), and whisky at Bumbershoot. You can follow our @WhiskySeattle account on Twitter, and that's different from our @WhiskyPortland, our @WhiskyLosAngeles or our @WhiskyNewYork accounts. Great.

There's a bunch of top Seattle picks. So this is a very customized page. This experience is completely owned and controlled by a team that's focused purely on Seattle. This is sort of the Holy Grail. It's hard to scale to this, which is why this other approach can really be okay for a lot of folks trying to scale up and rank for all of those geo terms plus their keywords.

What's the process by which you go about this? I'm glad you asked, because I wrote it down. Number one, we want to determine the searcher's intent and figure out how we can satisfy the query while, at the same time, delighting visitors. We've got to create a unique, special experience for them, delighting them in addition to satisfying their query.

So for Seattle whisky, I can show them where they can buy whisky in the city. I can recommend some bars that have a great whisky selection, and then I can delight them by showing some tips and tricks from our community. I can delight them by giving them special priority access to events. I can delight them by giving them a particular guide that they could print out and take with them or the ability to register for special things that they couldn't get elsewhere, buy whiskies that they'd never be able to get, whatever it is, something special to delight them.

Number two, I want to select the group of keywords, and I say group because usually there are a few keywords in every one of the verticals that I've talked to people about. There are usually between 3 and about 20 sets of keywords that they really, deeply care about for each geography. Do be careful: you've got to be wary of local colloquialisms. For example, in the United States, whiskey is often spelled with an "e," W-H-I-S-K-E-Y, whereas in the U.K. and most of the rest of the English-speaking world, it's spelled W-H-I-S-K-Y, with no "e."

Also you want to take those groups, and you want to actually combine them. So say I've got a bunch of keywords over here. I might want to say, "Hey, you know what? These keywords, whisky tastings and whisky events, that's the same intent." I don't need to create two different landing pages for those. Let's bunch them up and group them and make that one page. That'll be our Seattle Whisky Events page, and we'll target tastings and events and festivals and whatever other synonyms might go in there.
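To make that grouping concrete, here's a minimal sketch of the keyword-bunching step, with hypothetical keywords, cities, and URL slugs (not a prescribed site structure): each intent bucket becomes exactly one landing page per city.

```python
# Hypothetical intent buckets: each bucket maps to ONE landing page per
# city, rather than one page per keyword variant.
INTENT_GROUPS = {
    "events": ["whisky events", "whisky tastings", "whisky festivals"],
    "stores": ["whisky stores", "buy whisky"],
    "bars":   ["whisky bars", "best whisky bars"],
}

CITIES = ["seattle", "portland", "los-angeles"]

def landing_pages(cities, intent_groups):
    """Yield one page slug per (city, intent) pair plus its target keywords."""
    for city in cities:
        for intent, keywords in intent_groups.items():
            slug = f"/{city}/whisky-{intent}/"
            # The Seattle events page targets tastings, events, and
            # festivals together, since the intent is the same.
            targets = [f"{city.replace('-', ' ')} {kw}" for kw in keywords]
            yield slug, targets

for slug, targets in landing_pages(CITIES, INTENT_GROUPS):
    print(slug, "->", ", ".join(targets))
```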

Third, I want to create a few really amazing pages following one of these two models, as samples and as instruction for all future ones. This is what we want to get to. Let's make the best, most perfect page for Seattle, and then we'll go make one for Portland and we'll go make one for Los Angeles. Then we'll figure out how to get that into a process that will scale for us. You want that process to be repeatable. You want it to be well-defined. You want it to be such that a content team, a contractor, or an agency who comes in can take that document, look at the examples, and replicate that on a city-by-city basis. That's going to require a lot of uniqueness. You need to have those high bars set up so that they can achieve them.

The fourth and last thing for these pages you're creating is you've got to be able to answer this question: Who will amplify this page and why? By amplify, I mean share socially, share via word of mouth, share via email, link to it. Who will amplify it and why? How are we going to reach them?

Then go get them. Go prove to yourself that with those two or three amazing example pages that you made that you can actually do it, and then make that part of your scaling process.

Now you've got something where you can truly say, "Yes, we can go geo by geo and have the potential to rank in market after market for the terms and phrases that we care about in the organic results."

Long term, if you have a lot of success in a city, my next suggestion would be that you move from the scaled-out model to the locally owned one, where you actually have a local team, even just one person, even a contractor, someone who visits. It doesn't have to be a permanent resident of that city. It can be someone who goes there a month out of the year or every few weekends, whatever it is, and owns that page, that experience, and that section of your site for that specific geo, and produces remarkable results.

They build relationships. That furthers your press, and that furthers your brand in that town. There's a lot of opportunity there. So that's eventually where you want to move to.

All right, everyone. I hope all of you out there who are building local, geo-targeted landing pages at scale have found this valuable, and I hope you're going to go build some phenomenal pages. Maybe someone will even start a whisky company for me.

All right, everyone. Take care. We'll see you again next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com



October 17th, 2014

Open Site Explorer’s New Link Building Opportunities Section (and a Slight Redesign)

Posted by randfish

Why hello there! You're looking marvelous today, you really are. And, in other good news, Open Site Explorer has a bit of a new look—and an entirely new section called "Link Opportunities" to help make some link prospecting tasks easier and more automated. Come with me and I'll show you; it'll be fun :-)

The new look

We know a lot of folks liked the old tab structure, but we ran out of space. With this redesign, we now have the flexibility to add new features and functionality simply by popping in new sections on the left sidebar menu. It's a little bit more like Moz Analytics, too, and we figure some cohesion between our products is probably wise.

  • New side navigation with plenty of room to grow and add new features (spam scoring and analysis, for example, will be coming in Q4—but shhh... I didn't actually ask for permission to talk about that yet. I figure begging forgiveness will work.)
  • Improved filtering that lets you slice and dice your link data more easily.
  • Notice how fast the new OSE is? Oh yeah, that's the stuff :-)

You can still access the old Open Site Explorer design for a few more weeks, but the new features will exist only in the new version.

Introducing the new link opportunities section

Need help finding outreach targets for your link building campaign? We're introducing three new reports that will help you build a curated list of potential targets. The new reports are available to all Moz Pro subscribers. If you're a community member, sign up for a Moz Pro Free Trial and you, too, can kick it with the new functionality.


Reclaim links

A filtered view of Top Pages that lets you easily export a ranked list of URLs to fix.


Unlinked mentions

Powered by FreshScape, this report uses Fresh Web Explorer queries to find mentions of a brand or site without links. Ping sources that may have talked about your brand, website, people, or products without giving you a link, and you can often encourage/nudge that link into existence (along with the great SEO benefits it brings).
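The idea behind this report is easy to approximate for ad-hoc checks of your own. Here's a minimal, stdlib-only Python sketch, with hypothetical URLs and a hypothetical brand domain, that flags mention pages containing no reference back to you. It's a rough stand-in for what the report does at scale, not FreshScape's actual implementation.

```python
import urllib.request

# Hypothetical pages that mention the brand (e.g., pulled from Fresh Web
# Explorer or any other mention-monitoring source).
MENTION_URLS = [
    "https://example.com/whisky-roundup",
    "https://example.org/best-tasting-events",
]
BRAND_DOMAIN = "randswhisky.example"  # hypothetical domain to look for

def is_unlinked_mention(url, domain, timeout=10):
    """True if the page can be fetched and contains no reference to our domain."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except OSError:
        return False  # unreachable page: skip it rather than flag it
    # Crude substring test; a real crawler would parse anchor hrefs.
    return domain not in html

outreach_targets = [u for u in MENTION_URLS if is_unlinked_mention(u, BRAND_DOMAIN)]
print(outreach_targets)
```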


Link intersect

Find pages that are linking to your competitors but not you. By entering two competitive domains (they don't have to be directly competitive; anyone you think you should be on lists with, or mentioned by the press alongside, is a good candidate), you can see pages that link to those sites but not yours. Getting creative with your targets here can reveal loads of awesome link opportunities.
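Conceptually, link intersect is just set arithmetic. Here's a minimal sketch, assuming you've exported linking root domains for each site into hypothetical one-domain-per-line files:

```python
# Find domains linking to both competitors but not to us.

def load_domains(path):
    """Read one linking root domain per line from an exported file."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

competitor_a = load_domains("links-competitor-a.txt")  # hypothetical exports
competitor_b = load_domains("links-competitor-b.txt")
ours = load_domains("links-ours.txt")

opportunities = (competitor_a & competitor_b) - ours
for domain in sorted(opportunities):
    print(domain)
```

Getting creative with the inputs (different competitors, press pages, list posts) changes what falls out of that intersection.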


This, however, is just the beginning. Be on the lookout for additional insights and opportunities as we improve our link index. We've just recently grown the size of FreshScape, which powers Fresh Web Explorer and two of the sections in Link Opportunities, so you should find lots of good stuff in there, but it can be a challenge. If you're struggling with query formatting or getting creative around potential opportunities, let us know (in the comments or via Q&A) and we can give you some pointers or maybe find some searches that do the trick.

What about the old OSE?

We changed the workflow a bit and want to make sure you've got time to adjust. If you're cranking through monthly reports or audits and want a more familiar OSE experience, you can switch to OSE Classic for a limited time. Just click on the "View in OSE Classic" link in the top right, and we'll default to the old version.

But keep in mind new features and enhancements, like improved performance and Link Opportunities, will only be available in the new release. We'll keep OSE Classic active until December 3rd in case you're feeling nostalgic.

We'd love your feedback

If you're using the new OSE and find problems, wish we'd change something, or have a particularly awesome experience, we'd love to hear from you in the comments below, in Q&A, or (especially if your issue is urgent/something broken) via our help team.



October 15th, 2014

Experiment: We Removed a Major Website from Google Search, for Science!

Posted by Cyrus-Shepard

The folks at Groupon surprised us earlier this summer when they reported the results of an experiment that showed that up to 60% of direct traffic is organic.

In order to accomplish this, Groupon de-indexed their site, effectively removing themselves from Google search results. That's crazy talk!

Of course, we knew we had to try this ourselves.

We rolled up our sleeves and chose to de-index Followerwonk, both for its consistent Google traffic and its good analytics setup—that way we could properly measure everything. We were also confident we could quickly bring the site back into Google's results, which minimized the business risks.

(We discussed de-indexing our main site moz.com, but... no soup for you!)

We wanted to measure and test several things:

  1. How quickly will Google remove a site from its index?
  2. How much of our organic traffic is actually attributed as direct traffic?
  3. How quickly can you bring a site back into search results using the URL removal tool?

Here's what happened.

How to completely remove a site from Google

The fastest, simplest, and most direct method to completely remove an entire site from Google search results is by using the URL removal tool.

CAUTION: Removing any URLs from a search index is potentially very dangerous, and should be taken very seriously. Do not try this at home; you will not pass go, and will not collect $200!

After submitting the request, Followerwonk URLs started disappearing from Google search results in 2-3 hours.

The information needs to propagate across different data centers around the globe, so the effect can be delayed in some areas. In fact, for the entire duration of the test, organic Google traffic continued to trickle in and never dropped to zero.

The effect on direct vs. organic traffic

In the Groupon experiment, they found that when they lost organic traffic, they actually lost a bunch of direct traffic as well. The Groupon conclusion was that a large amount of their direct traffic was actually organic—up to 60% on "long URLs".

At first glance, the overall amount of direct traffic to Followerwonk didn't change significantly, even when organic traffic dropped.

In fact, we could find no discrepancy in direct traffic outside the expected range.

I ran this by our contacts at Groupon, who said this wasn't totally unexpected. You see, in their experiment they saw the biggest drop in direct traffic on long URLs, defined as URLs long enough to be in a subfolder, like http://followerwonk.com/bio/?q=content+marketer.

For Followerwonk, the vast majority of traffic goes to the homepage and a handful of other URLs. This means we didn't have a statistically significant sample size of long URLs to judge the effect. For the long URLs we were able to measure, the results were nebulous.

Conclusion: While we can't confirm the Groupon results with our outcome, we can't discount them either.

It's quite likely that a portion of your organic traffic is attributed as direct. This is because different browsers, operating systems, and user privacy settings can block referral information from reaching your website.
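A toy sketch of the attribution logic at work here: analytics packages generally bucket any hit that arrives without a referrer as "direct," regardless of how the visitor actually found you, which is exactly how organic visits leak into the direct bucket.

```python
# Simplified channel classification, loosely modeled on how analytics
# tools bucket traffic. Not any specific vendor's implementation.
SEARCH_ENGINE_HOSTS = ("google.", "bing.", "yahoo.", "duckduckgo.")

def classify(referrer):
    if not referrer:
        # Privacy settings, apps, and stripped referrers all land here.
        return "direct"
    if any(host in referrer for host in SEARCH_ENGINE_HOSTS):
        return "organic"
    return "referral"

print(classify("https://www.google.com/"))  # organic
print(classify(None))  # direct, even if the visit began with a search click
```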

Bringing your site back from death

After waiting 2 hours, we deleted the request. Within a few hours all traffic returned to normal. Whew!

Does Google need to recrawl the pages?

If the time period is short enough, and you used the URL removal tool, apparently not.

In the case of Followerwonk, Google removed over 300,000 URLs from its search results, and made them all reappear in mere hours. This suggests that the domain wasn't completely removed from Google's index, but only "masked" from appearing for a short period of time.

What about longer periods of de-indexation?

In both the Groupon and Followerwonk experiments, the sites were only de-indexed for a short period of time, and bounced back quickly.

We wanted to find out what would happen if you de-indexed a site for a longer period, like two and a half days.

I couldn't convince the team to remove any of our sites from Google search results for a few days, so I chose a smaller personal site that I often subject to merciless SEO experiments.

In this case, I de-indexed the site and didn't remove the request until three days later. Even with this longer period, all URLs returned within just a few hours of cancelling the URL removal request.

In the chart below, we revoked the URL removal request on Friday the 25th. The next two days were Saturday and Sunday, both lower traffic days.

Test #2: De-index a personal site for 3 days

Likely, the URLs were still in Google's index, so we didn't have to wait for them to be recrawled.

Here's another shot of organic traffic before and after the second experiment.

For longer removal periods, a few weeks for example, I speculate Google might drop these semi-permanently from the index, and re-inclusion would take a much longer time.

What we learned

  1. While a portion of your organic traffic may be attributed as direct (due to browsers, privacy settings, etc.), in our case the effect on direct traffic was negligible.
  2. If you accidentally de-index your site using Google Webmaster Tools, in most cases you can quickly bring it back to life by deleting the request.
  3. Re-inclusion happens quickly, even after we removed a site for over two days. Longer than this, the result is unknown, and you could have problems getting all the pages of your site indexed again.

Further reading

Moz community member Adina Toma wrote an excellent YouMoz post on the re-inclusion process using the same technique, with some excellent tips for other, more extreme situations.

Big thanks to Peter Bray for volunteering Followerwonk for testing. You are a brave man!



August 12th, 2014

Beyond Search: Unifying PPC and SEO on the Display Network


Posted by anthonycoraggio

PPC and SEO go better together. By playing both sides of the coin, it's possible to make more connections and achieve greater success in your online marketing than with either alone.

It's well known that the data found in AdWords search query reports can be a valuable source of information for keyword research. Managing the interaction effects of sharing the SERPs and capturing reinforcing real estate on the page is of course important. Smart marketers will use paid search to test landing pages and drive traffic to support experiments on the site itself. Harmony between paid and organic search is a defining feature of well-executed search engine marketing.

Unfortunately, that's where the game all too often stops, leaving a world of possibilities for research and synergy waiting beyond the SERPs on the Google Display Network. Today I want to give you a couple techniques to kick your paid/organic collaboration back into gear and get more mileage from combining efforts across the disciplines.

Using the display network

If you're not familiar with it already, the GDN is essentially the other side of AdSense, offering the ability to run banner, rich media, and even video ads across the network from AdWords or Doubleclick. There are two overarching methods of targeting these ads: by context/content, and by using remarketing lists. Regardless of your chosen method, ads here are about as cheap as you can find (often under a $1 CPC), making them a prime tool for exploratory research and supporting actions.

Contextual and content-based targeting offers some simple and intuitive ways to extend existing methods of PPC and SEO interaction. By selecting relevant topics, key phrases, or even particular sites, you can place ads in the wild to test the real world resonance of taglines and imagery with people consuming content relevant to yours.

You can also take a more coordinated approach during a content marketing campaign using the same type of targeting. Enter a unique phrase from any placement you earn on an AdSense-running page as a keyword target, and you can back up that article or blog post with a powerful piece of screen real estate and a call to action that is fully under your control. This approach mirrors the tactic of using paid search ads to better control organic results, and offers a direct route to conversion that usually would not otherwise exist in this environment.

Research with remarketing

Remarketing on AdWords is a powerful tool to drive conversions, but it also produces some very interesting and frequently neglected data in the process: your reports will tell you which other sites and pages your targeted audience visits once your ads display there. You will, of course, be restricted here to sites running AdSense or DoubleClick inventory, but this still adds up to over 2 million potential pages!

If your firm is already running remarketing, you'll be able to draw some insights from your existing data, but if you have a specific audience in mind, you may want to create a new list anyway. While it is possible to create basic remarketing lists natively in AdWords, I recommend using Google Analytics to take advantage of the advanced segmentation capabilities of the platform. Before beginning, you'll need to ensure that your AdWords account is linked and your tracking code is updated.

Creating your remarketing list

First, define who exactly the users you're interested in are. You're going to have to operationalize this definition based on the information available in GA/UA, so be concrete about it. We might, for example, want to look at users who have made multiple visits within the past two weeks to peruse our resources without completing any transactions. Where else are they bouncing off to instead of closing the deal with us?

If you've never built a remarketing list before, pop into the creation interface in GA through Admin > Remarketing > Audiences. Hit the big red '+ Audience' button to get started. You're first presented with a selection of list types:

The first three options are the simplest and least customizable, so they won't be able to parse out our theoretical non-transactors, but they can be handy for this application nonetheless. Smart Lists are a relatively new and interesting option. Essentially, this will create a list based on Google's best algorithmic guess at which of your users are most likely to convert upon returning to your site. The 'black box' element of Smart Lists makes them less precise as a tool here, but it's simple to test and see what they turn up.

The next three are relatively self explanatory; you can gather all users, all users to a given page, or all that have completed a conversion goal. Where it gets truly interesting is when you create your own list using segments. All the might of GA opens up here for you to apply criteria for demographics, technology/source, behavior, and even advanced conditions and sequences. Very handily, you can also import any existing segments you've created for other purposes.

In this figure, we're simply translating the example from above into some criteria that should fairly accurately pick out the individuals in which we are interested.
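To make "operationalize" concrete, here's a tiny sketch of the same criteria, two or more sessions in the past two weeks and zero transactions, applied to hypothetical per-user summaries. The field names are invented; GA's own segment builder is where you'd actually define this.

```python
# Hypothetical per-user rollups for the last 14 days.
users = [
    {"id": "u1", "sessions_14d": 3, "transactions_14d": 0},
    {"id": "u2", "sessions_14d": 1, "transactions_14d": 0},
    {"id": "u3", "sessions_14d": 4, "transactions_14d": 2},
]

# Repeat visitors who browsed but never transacted.
audience = [u["id"] for u in users
            if u["sessions_14d"] >= 2 and u["transactions_14d"] == 0]
print(audience)  # ['u1']
```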

Setting up and going live

When you've put your list together, simply save it and hop back over to AdWords. Once it counts at least 100 users in its target audience, Google will let you show ads using it as targeting criteria. To set up the ad group, there are a few key considerations to bear in mind:

  1. You can further narrow your sample using AdWords' other targeting options, which can be very handy. For example, want to know only what sites your users visit within a certain subject category? Plug in topic targeting. I won't jump down the rabbit hole of possibilities here, but I encourage you to think creatively in using this capability.
  2. You'll of course need to fill the group with some actual ads for it to work. If you can't get some applicable banner ads, you can create some simple text ads. We might be focusing on the research data to be had in this particular group, but remember that users are still going to see and potentially click these ads, so make sure you use relevant copy and direct them to an appropriate landing page.
  3. To home in on unique and useful discoveries, consider setting some of the big generic inventory sources like YouTube as negative targets.
  4. Finally, set a reasonable CPC bid to ensure your ads show. $0.75 to $1.00 should be sufficient; if your ads aren't turning up many impressions with a decent sized list, push the number up a bit.

To check on the list size and status, you can find it in Shared Library > Audiences or back in GA. Once everything is in place, set your ads live and start pulling in some data!

Getting the data

You won't get your numbers back overnight, but over time you will collect a list of the websites your remarketed ads show on: all the pages across the vast Google Display Network that your users visit. To find it, enter AdWords and select the ad group you set up. Click the "Display Network" and "Placements" tabs:

[Screenshot: the "Display Network" and "Placements" tabs in AdWords]

You'll see a grid showing the domain-level placements your remarketing lists have shown on, with the opportunity to customize the columns of data included. You can sift through the data on a more granular level by clicking "see details"; this will provide you with page-level data for the listed domains. You're likely to see a chunk of anonymized visits; there is a workaround to track down the pages in here, but be advised it will take a fair amount of extra effort.

[Screenshot: the placement-level "see details" view]

Tada! There you are—a lovely cross section of your target segment's online activities. Bear in mind you can use this approach with contextual, topic, or interest targeting that produces automatic placements as well.
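If you export that grid, summarizing it takes only a few lines. A minimal sketch with assumed column names ("Placement", "Impressions", "Clicks"; adjust to whatever your report actually contains) that ranks the domains your audience frequents:

```python
import csv
from collections import defaultdict

impressions = defaultdict(int)
clicks = defaultdict(int)

# Hypothetical CSV export of the placements grid; values assumed to be
# plain integers (strip thousands separators first if present).
with open("placements-report.csv", newline="") as f:
    for row in csv.DictReader(f):
        impressions[row["Placement"]] += int(row["Impressions"])
        clicks[row["Placement"]] += int(row["Clicks"])

# The sites your remarketed audience visits most, by ad impressions.
for domain in sorted(impressions, key=impressions.get, reverse=True)[:20]:
    print(domain, impressions[domain], clicks[domain])
```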

Depending on your needs, there are of course myriad ways to make use of display advertising tools in sync with organic marketing. Have you come up with any creative methods or intriguing results? Let us know in the comments!



August 11th, 2014

Syndicating Content – Whiteboard Friday

Posted by Eric Enge

It's hard to foresee a lot of benefit to your hard work creating content when you don't have much of a following, and even if you do, scaling that content creation is difficult for any marketer. One viable answer is syndication, and in this Whiteboard Friday, Eric Enge shows you both reasons why you might want to syndicate as well as tips on how to go about it.

Heads-up! We published a one-two punch of Whiteboard Friday videos from our friends at Stone Temple Consulting today. Check out "I See Content (Everywhere)" by Mark Traphagen, too!

For reference, here's a still of this week's whiteboard!

Video transcription

Hi everybody. I'm Eric Enge, CEO of Stone Temple Consulting. Welcome to another edition of Whiteboard Friday, and today we're going to be talking about syndicated content. I probably just smeared my picture, but in any case, you hear about syndicated content and the first thing that comes across your mind is, "Doesn't that create duplicate content, and isn't somebody going to outrank me for my own stuff?" And it is a legitimate concern. But before I talk about how to do it, I want to tell you about why to do it, because there are really, really good sound reasons for syndicating content.

Why (and how) should I syndicate my content?

So first of all, here is your site. You get to be the site in purple by the way, and then here is an authority site, which is the site in green. You have an article that you've written called, "All About Fruit," and you deliver that article to that authority site and they publish the same article, hence creating the duplicate content. So why would you consider doing this?

Well, the first reason is that, by association with a higher-authority site, there is going to be some authority passed to you, starting from a human perspective: people see that your authored content is on this authority site. That by itself is a great thing. When we do the right things, we're also going to get some link juice, or SEO authority, passed to you as well. So these are really good reasons by themselves to do it.

But the other thing that happens is you get exposure to what I call OPA or Other People's Audiences, and that's a very helpful thing as well. These people, as I've mentioned before, they're going to see you here, and this crowd, some of this crowd is going to start to become your crowd. This is great stuff. But let's talk about how to do it. So here we go.

Three ways to contentedly syndicate content

#1 rel=canonical

There are three ways that you can do this that can make this work for you. The first is, here's your site again, here's the authority site. You get the authority site to implement a rel=canonical tag pointing back to your page, the same page, the exact article page on your site. That tells Google and Bing that the real canonical version of the content is the one over here. The result of that is that all of the PageRank that accrues to this page on the authority site now gets passed over to you. So any links, all the links, in fact, that this page gets now get passed through to you, and you get the PageRank from all of them. This is great stuff. But that's just one of the solutions. It's actually the best one, in my opinion.

#2 meta noindex

The second-best one down here, okay, same scenario: your site, the authority's site. The authority's site implements a meta noindex tag on their page. That's an instruction to the search engines not to keep this page in the index, which solves the duplicate content problem for you in a different way, by just taking the copy out of the index. Now, any links from this page over to your page still pass PageRank. So you still want to make sure you're getting those in the process. So, a second great solution for this problem.

#3 Clean Link to Original Article

So these are both great, but it turns out that a lot of sites don't really like to do either of these two things. They actually want to be able to have the page in the index, or they don't want to take the trouble to do this extra coding. There is a third solution, which is not the best solution, but it's still very workable in the right scenarios. That is, you get them to implement a clean text link from the copied page on their site over to the same article on your site. The search engines are pretty good at understanding, when they see that link, that it means you're the original author. So you're still getting a lot of authority passed, and you're probably eliminating the duplicate content problem.
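If you want to verify which of these three safeguards a syndication partner actually implemented, the checks are easy to script. A minimal stdlib sketch, with hypothetical URLs, that looks for a canonical tag pointing home, a robots noindex, or a clean link back to the original:

```python
from html.parser import HTMLParser
import urllib.request

ORIGINAL_URL = "https://yoursite.example/all-about-fruit"  # hypothetical

class SyndicationAudit(HTMLParser):
    """Collect the three signals: canonical link, robots meta, anchors."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.noindex = False
        self.links_to_original = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and "canonical" in (a.get("rel") or "").lower():
            self.canonical = a.get("href")
        elif tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.noindex = "noindex" in (a.get("content") or "").lower()
        elif tag == "a" and a.get("href") == ORIGINAL_URL:  # exact match only
            self.links_to_original = True

def audit(syndicated_url):
    with urllib.request.urlopen(syndicated_url, timeout=10) as resp:
        parser = SyndicationAudit()
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    if parser.canonical == ORIGINAL_URL:
        return "best: rel=canonical points at the original"
    if parser.noindex:
        return "good: the copy is meta noindex"
    if parser.links_to_original:
        return "workable: clean link back to the original"
    return "warning: none of the three safeguards found"

print(audit("https://authority.example/all-about-fruit"))  # hypothetical copy
```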

So again, let's just recap briefly. The reason why you want to go through this trouble is you get authority from the authority site passed to you, both at a human level and at an SEO level, and you can gain audience from the audience of that authority site.

So that's it for this edition of Whiteboard Friday.

Video transcription by Speechpad.com



August 8th, 2014