
Local Search Expert Quiz: How Much Do You Know about Local SEO?

Posted by Cyrus-Shepard

How big is local SEO?

Our latest Industry Survey revealed that over 67% of online marketers report spending time on local search. We've watched demand for local SEO expertise grow as Google's competitive landscape continues to evolve.

Last year, Moz introduced the SEO Expert Quiz, which to date over 40,000 people have attempted to conquer. Today, we're proud to announce the Local Search Expert Quiz. Written by local search expert Miriam Ellis, the quiz contains 40 questions and takes less than 10 minutes to complete.

Ready to get started? When you are finished, we'll automatically score your quiz and reveal the correct answers.

View Survey

Rating your score

Keep in mind the Local Search Expert Quiz is just for fun. That said, we've established the following guidelines to help judge your results.

  • 0-39% Newbie: Time to study up on your citation data!
  • 40-59% Beginner: Good job, but you're not quite in the 7-pack yet.
  • 60-79% Intermediate: You're getting close to the centroid!
  • 80-89% Pro: Let's tackle multi-location!
  • 90-100% Guru: We all bow down to your local awesomeness

Resources to improve your performance

Want to learn more about local search? Here's a collection of free learning resources to help improve your performance (and possibly your income).

  1. The Moz Local Learning Center
  2. Glossary of Local Search Terms and Definitions
  3. Guidelines for Representing Your Business on Google
  4. Local Search Ranking Factors
  5. Blumenthal's Blog
  6. Local SEO Guide
  7. Whitespark Blog

You can also learn the latest local search tips and tricks by signing up for the LocalUp Advanced one-day conference or reading local SEO posts on the Moz Blog.

Embed this Quiz

We created this quiz using Polldaddy, and we're making it available to embed on your own site. This isn't a backlink play - we didn't even include a link to our own site (but feel free to include one if you feel generous).

Here's the embed code:

<iframe frameborder="0" width="100%" height="600" scrolling="auto" allowtransparency="true" src="http://mozbot.polldaddy.com/s/local-search-expert-quiz?iframe=1"><a href="http://mozbot.polldaddy.com/s/local-search-expert-quiz">View Survey</a></iframe>

How did you score on the quiz? Let us know in the comments below!



How to Have a Successful Local SEO Campaign in 2015

Posted by Casey_Meraz

Another year in search has passed. It's now 2015, and we've seen some major changes in local ranking factors since 2014, which I expect to keep changing throughout 2015. For some, a new year means a fresh starting point; for others, it's a time of reflection to analyze how successful their campaigns have been. Whatever boat you're in, make sure to sit down and read this guide.

In this guide we will cover how you can have a successful local SEO campaign in 2015, starting with the basics and getting down to five action items you should focus on now. This is not limited to Google My Business; it also covers localized organic results.

Now the question is where do you start?

Since Pigeon has now rolled out to the US, UK, Australia, and Canada, it's important to make sure your strategies are in line with it no matter what part of the world you're in. Your local SEO campaign in 2015 will be much more successful if you put more work into it. Don't be fooled, though: more work by itself isn't going to get you where you need to be. You need to work smarter toward the goals that are going to fuel your conversions.

For some industries that might mean more localized content, for others it may mean more social interaction in your local area. Whatever it ends up being, the root of it should be the same for most. You need to get more conversions for your website or your client's website. So with this in mind let's make sure we're on the same page as far as our goals are concerned.

Things you need to know first

Focus on the right goals

Recently I had a conversation with a client who really wanted to drive home the point that he was not interested in traffic; he was interested in the conversions he could track. He also wanted to see how all of the content resource pieces I recommended would help. He was tired of the silly graphs from other agencies showing great rankings on a variety of keywords when what he cared about was which efforts brought him the most value: conversions and meaningful traffic. I really appreciated this statement and felt like he really got it.

Still, far too often I have to talk to potential clients and explain why their sexy-looking traffic reports aren't actually helping them. You can have all of the traffic in the world, but if it doesn't meet one of your goals of conversions or education, then it's probably not helping. Even if you make the client happy with your snazzy reports for a few months, eventually they're going to want to know their return on investment (ROI).

It's 2015. If your clients aren't tracking conversions properly, give them the help they need. Record their contacts in a CRM and track the source of each of these contacts. Track them all the way through the sales funnel.

That's a simple and basic marketing example but as SEOs your role has transformed. If you can show this type of actual value and develop a plan accordingly, you will be unstoppable.

Second, don't get tunnel vision

You may wonder why I started a little more basic than normal in this post. The fact is that in this industry there is no formal training program that covers all aspects of what we do.

We all come from different walks of life and experience, which makes it easy for us to get tunnel vision. You probably opened this article with the idea of "How can I dominate my Google local rankings?" While we cover some actionable tips you should be using, you need to think outside of the box as well. Your website is not the only online property you need to be concerned about.

Mike Ramsey from Nifty Marketing put out a great study on measuring click-through rates from the new local stack. In this study he measured the click-through rates of users conducting several different searches, like "Salt Lake City Hotel" in the example below. With so many different options, look where the users are clicking:

They're really clicking all over the place! While it's cool to be number one, it's much better if you get clicks from your paid ad, organic result, local result, and barnacle SEO efforts (which we'll talk about a little later).

If you combine your conversion marketing data with your biggest priorities, you can put together a plan to tackle the most important areas for your industry. Don't assume it's a one-size-fits-all approach.

Third, some spam still works. Don't do it and rise above it.

There's no doubt that some spammy tactics are still working. Google gets better every day, but you still see crap like the example below show up in the SERPs.

While it sucks to see that kind of stuff, remember that in time it disappears (just as it did before this article was published). If you take shortcuts, you're going to get caught, and it's not worth it for the client or the heartache on your end. Maintain the course and do things the right way.

Now let's get tactical and prepare for 2015

Now it's time for some practical and tactical takeaways you can use to dominate your local search campaign in 2015.

Practical tip 1: start with an audit

Over the years, one of the best lessons I have learned is that it's OK to say "I don't know" when you don't have the answer. Consulting with industry experts or people with more experience than you is not a bad thing, and will likely only help you enhance your knowledge and gain a different perspective. It can be humbling, but the experience is amazing. It can open your mind.

Last year, I had the opportunity to work with over ten of the industry's best minds and retained them for site audits on different matters.

The perspective this gives is absolutely incredible and I believe it's a great way to learn. Everyone in this industry has come from a different background and seen different things over the years. Combining that knowledge is invaluable to the success of your clients' projects. Don't be afraid to do it and learn from it. This is also a good idea if you feel like your project has reached a stalemate. Getting more advice, identifying potential problems, and having a fresh perspective will do wonders for your success.

As many of the experts have confirmed, ever since the Pigeon update, organic and local ranking factors have been more tied together than ever. Since Google has moved in this direction in a big way, I would not expect it to stop.

This means that you really do need to worry about things like site speed, content, penalties, mobile compatibility, site structure, and more. On a side note, guess what will happen to your organic results if you keep this as a top priority? They will flourish and you will thank me.

If you don't have the budget or resources to get a third party opinion, you can also conduct an independent audit.

Do it yourself local SEO audit resources:

Do it yourself organic SEO audit resources:

Alternatively, if you're more in the organic boat, you should also check out this guide by Steve Webb on How To Perform The World's Greatest SEO Audit.

Whatever your situation is, it's worth the time to have this perspective yearly or even a couple times a year if possible.

Practical tip 2: consider behavioral signals and optimize accordingly

I remember having a conversation with Darren Shaw, the founder of Whitespark, at MozCon 2013 about his thoughts on user behavior affecting local results. At the time I didn't do too much testing around it. However just this year, Darren had a mind-blowing presentation at the Dallas State of Search where he threw in the behavioral signals curve ball. Phil Rozek also spoke about behavioral signals and provided a great slide deck with actionable items (included below).

We have always speculated about behavioral signals, but between his tests and some of Rand's IMEC Lab tests, I became more of a believer last year. Now, before we go too deep on this, remember that your local campaign is NOT focused on just your local pack results. If user behavior can have an impact on search results, we should definitely be optimizing for our users.

You can view Phil Rozek's presentation below:

Don't just optimize for the engines, optimize for the humans. One day when Skynet is around this may not be an issue, but for now you need to do it.

So how can you optimize for behavioral signals?

There is a dark side and a light side path to this question. If you ask me I will always say follow the light side as it will be effective and you don't have to worry about being penalized. That's a serious issue and it's unethical for you to put your clients in that position.

Local SEO: how to optimize for behavioral signals

Do you remember the click-through study we looked at a bit earlier from Nifty Marketing? Do you remember where the users clicked? If you look again or just analyze user and shopper behavior, you might notice that many of the results with the most reviews got clicks. We know that reviews are hard to get so here are two quick ways that I use and recommend to my clients:

1. Solicit your Gmail clients for reviews

If you have a list of happy Gmail clients, you can simply send them an email with a direct link to your Google My Business page. Just pull up your local page in a browser and copy its URL.

Once you have this URL, simply remove the /posts and replace it with:

/?hl=en&review=1

It will look like this:
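As a hypothetical example (the business name below is made up, and the URL follows the plus.google.com format this post describes), the swap looks like this:

Before: https://plus.google.com/+ExampleCoffeeRoasters/posts
After: https://plus.google.com/+ExampleCoffeeRoasters/?hl=en&review=1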

If your clients click on this link while logged into Gmail, they will automatically be taken to the review page, which opens up the box to leave a review. It doesn't get much simpler than that.

2. Check out a service like Mike Blumenthal's Get Five Stars for reviews

I recently used this with a client and got a lot of great feedback and several reviews.

Remember that these reviews will also help on third-party sites, and can help your Google My Business ranking positions as well as click-through rates. You can check out Get Five Stars here.

Beyond getting reviews, another way to improve click-through rates is to optimize the appearance of your Google My Business page.

3. Optimize your local photos

Your Google My Business page includes photos. Don't use generic photos. Use high-quality photos so that when users hover over your listing, they get an accurate representation of what they're looking for. Doing this will increase your click-through rate.

Organic SEO: Optimize for Behavioral Signals

Optimization for click-through rates on organic results typically focuses on three areas. You're likely very familiar with the first two, but you should not ignore them.

1. Title tags: optimize them for the user and engine

Optimize your meta title tags to increase click-through rates. Each page should have a unique title tag that helps the viewer with their query.

2. Meta descriptions: optimize them for the user

Optimize your meta description to get the user to click on the search result. If you're skipping this just because Google may or may not pull it into the snippet, you're missing opportunities and clicks.
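As a minimal sketch of how these two elements might look together (the business, location, and wording are hypothetical placeholders, not copy to reuse):

<head>
  <!-- Unique, descriptive title: primary topic first, brand last -->
  <title>Emergency Plumbing in Denver, CO | Example Plumbing Co.</title>
  <!-- Meta description written to earn the click, not just to hold keywords -->
  <meta name="description" content="Licensed 24/7 emergency plumbers serving Denver. Upfront pricing and free estimates. Call now and we can usually be on site within the hour.">
</head>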

3. Review Schema markup: add this to appropriate pages

Review Schema markup is still a very overlooked opportunity. As we talked about above in the local section, if you don't have reviews coded in Schema, you could be missing out on getting the orange stars in organic results.
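As a hedged sketch of what that markup might look like using schema.org's AggregateRating type in JSON-LD (the business name and figures are placeholders; the values should always reflect real reviews actually shown on the page):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "38"
  }
}
</script>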

Practical tip 3: don't ignore barnacle SEO

I firmly believe that most people still aren't taking advantage of barnacle SEO, and I'm a big fan of it. When I first heard Will Scott introduce this term at Pubcon, I thought it was spot on. According to Will Scott's website Search Influence, barnacle SEO is "attaching oneself to a large fixed object and waiting for the customers to float by in the current." In a nutshell, if you're trying to rank on page one of Google, you will find other strong properties there that you may be able to attach yourself to. If Yelp results come up for a lot of your search terms, you might identify that as an opportunity. There are three main ways you can take advantage of this.

1. You can try to have the most visible profile on that third party page

If Yelp is ranking for LA Personal Injury Attorneys, it would suit you to figure out how the top listings are showing up there. Maybe your customers are headed there and then doing some shopping and making a selection, or maybe they're using it as a research platform and will then visit your website. If your profile looks great and shows up high on the list, you just gave yourself a better chance at getting a conversion.

2. You can try to get your page to rank

Hey, just because you don't own Yelp.com or whatever similar site you've found, doesn't mean you shouldn't put in the effort to have it rank. If Google is already showing you that they trust a third party site by ranking it, you can use similar organic ranking techniques that you would use on your own site to make your profile page stronger. Over time you might add this to your bio on interviews or other websites to earn links. If you increase the visibility of your profile on search engines and they see your website on the same page you might increase conversions.

3. You can help your Google My Business rankings

If the site you're using passes link juice and you earn links to your third-party profile page, you will start to see some strong results. Links have been a big factor in local rankings since Pigeon rolled out this year, and it's an opportunity that should not be missed.

So how can you use this advice?

Start by finding a list of potential barnacle SEO partners for your industry. As an example, I did a search for "Personal Injury Attorneys" in Los Angeles. In addition to the law firms that showed up in the results on the first page, I also identified four additional places I may be able to show up on.

  1. Yelp
  2. Thumbtack
  3. Avvo
  4. Wikipedia

If you were an attorney, it would be worth your while to explore these and see if any make sense for you to contribute to.

Practical tip 4: earn some good links

Most people get too carried away with link building. I know, because I used to do it. The key with link building is to change your approach and understand that it's always better to earn a few high-quality links than hundreds or thousands of low-quality ones.

For example, a link like this one that one of our clients earned is what I'm talking about.

If you want to increase your local rankings you can do so by earning these links to your associated Google My Business landing page.

Do you know the URL you entered in your Google My Business page when you set it up? That's the one I'm talking about. In most cases this will be linked to either a local landing page for that location or the home page. It's essential to your success that you earn solid links to this page.

Simple resources for link building

Practical tip 5: have consistent citations and remove duplicates

Identifying and correcting incorrect or duplicate citations has been getting easier and easier over the years. Even if you don't want to pay someone to do it, you can sign up for some great do-it-yourself tools. Your goal with any citation cleanup program is this:

  1. Ensure there are no duplicate citations
  2. Ensure there are no incorrect citations with wrong phone numbers, old addresses, etc.

You can ignore small differences and inconsistencies like St vs. Street. I believe the importance of citations has been greatly reduced over the past year. At the same time, you still want to be the least imperfect and provide your customers with accurate information if they're looking on third party websites.

Let's do good things in 2015

2014 was a tough year in search altogether. We had ups, like Penguin refreshes, and we had downs, like the removal of authorship. I'm guessing 2015 will be no different. Staying on the roller coaster and keeping with the idea of having the "least imperfect" site is the best way to ring in the new year and keep marching forward. If you had a tough year in local search, keep your head up high, fix any existing issues, and sprint through this year by making positive changes to your site.



How to Provide Unique Value in Your Content – Whiteboard Friday


Posted by randfish

Marketers of all stripes are hearing more about providing unique content and value to their audiences, and how that's what Google wants to show searchers. Unique content is straightforward enough, but what exactly does everyone mean by "unique value?" What does that actually look like? In today's Whiteboard Friday, Rand illustrates the answer.

For reference, here's a still of this week's whiteboard!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat a little about providing unique value in your content. Now I've been known to talk a lot about what you need to do to get to the kind of uniqueness in content that Google wants to index, that searchers want to find, that is likely to earn you amplification and links and all the signals that you'll need to perform well in the rankings, and to perform well on social media and in content marketing of all kinds.

The challenge has been that I've seen a lot of people adopt this attitude around, yes, unique content, unique value, but merge those two and not view them as two different things and really not understand what I mean when I say unique value at all, and it's not just me. A lot of the content marketing and SEO industries are talking about the need for unique value, and they may say other words to describe that. But unfortunately, as an industry, we've not yet coalesced around what that idea means, and so this Whiteboard Friday is to try and explain exactly what a lot of these best practices and experts are talking about when they say "unique value."

Modern criteria for content

So let's start by talking about our modern criteria for content, and I have a slide that I like to show a lot that kind of displays this, and many other folks in the field have as well. So if I'm going to be producing content, I need to meet these five criteria.

One of a kind

One of a kind is basically what we meant when we said old unique content, meaning that the engines have never seen those words and phrases and numbers and visuals and whatever in that order on a page on the web previously. It's been written for the first time, produced and published for the first time. Therefore, it is one of a kind, doesn't appear elsewhere.

Relevant

Relevant meaning it contains content that both searchers and engines interpret as on topic to that searcher's query or their intent. Sometimes you can be on topic to the query, meaning you've used the words and the phrases that the searcher used, and not be on topic to their intent. What did they actually want to get out of the search? What question are they trying to answer? What information are you trying to get?

Helpful

This one's pretty obvious. You should resolve the searcher's query in a useful, efficient manner. That should be a page that does the job that they're hoping that that content is going to do.

Uniquely valuable

This is the one we're going to be talking about today, and what we mean here is provides information that's unavailable or hard to get elsewhere -- I'm going to dive into that a little bit more --

Great user experience

This means it's easy and pleasurable to consume anywhere on any device.

You meet these criteria with your content and you've really got something when it comes to a content marketing strategy or when it comes to content you're producing for SEO. This is a pretty solid checklist that I think you can rely on.

Unique value and you (and your website)

The challenge is this one. Uniquely valuable has been a really hard concept for people to wrap their heads around, and so let's dig in a little more on what we mean when we say "unique value."

So these are kind of the three common criteria that we mean when we say "unique value," and I'm actually going to show some examples as well.

1) Massive upgrade in aggregation, accessibility and design

The first one is a massive upgrade versus what's already available on the web in aggregation, accessibility, and/or design. Meaning you should have someone who views that content say, "Wow. You know, I've seen this material presented before, but never presented so well, never so understandable and accessible. I really like this resource because of how well aggregated, how accessible, how well designed this resource is."

Good examples, there's a blog post from the website Wait But Why on the Fermi Paradox, which is sort of a scientific astrophysics, "why are we alone in the universe" paradox concept, and they do a brilliant job of visualizing and explaining the paradox and all of the potential scenarios behind it. It's so much fun to read. It's so enjoyable. I've read about the Fermi Paradox many times and never been as entranced as I was as when I read this piece from Wait But Why. It really was that experience that says, "Wow, I've seen this before, but never like this."

Another great site that does pure aggregation, but they provide incredible value is actually a search engine, a visual search engine that I love called Niice.co. Not particularly easy to spell, but you do searches for things like letter press or for emotional ideas, like anger, and you just find phenomenal visual content. It's an aggregation of a bunch of different websites that show design and visual content in a search interface that's accessible, that shows all the images in there, and you can scroll through them and it's very nicely collected. It's aggregated in the best way I've ever seen that information aggregated, therefore, providing unique value. Unfortunately, since it's a search engine, it's not actually going to be indexed by Google, but still tremendously good content marketing.

2) Information that is available nowhere else

Number two is information that's available nowhere else. When I say "information," I don't mean content. I don't mean words and phrases. I don't mean it's one-of-a-kind in that if I were to go copy and paste a sentence fragment or a paragraph and plug it into Google, that I wouldn't find that sentence or that paragraph elsewhere. I mean unique information, information that, even if it were written about thousands of different ways, I couldn't find it anywhere else on the web. You want your visitor to have experience of, "Wow, without this site I never would have found the answers I sought." It's not that, "Oh, this sentence is unique to all the other sentences that have been written about this topic." It's, "Ah-ha this information was never available until now."

Some of my favorite examples of that -- Walk Score. Walk Score is a site that took data that was out there and they basically put it together into a scoring function. So they said, "Hey, in this Ocean Beach neighborhood in San Diego, there are this many bars and restaurants, grocery stores, banks, pharmacies. The walkability of that neighborhood, therefore, based on the businesses and on the sidewalks and on the traffic and all these other things, the Walk Score out of 100 is therefore 74." I don't know what it actually is. Then you can compare and contrast that to, say, the Hillcrest neighborhood in San Diego, where the Walk Score is 88 because it has a far greater density of all those things that people, who are looking for walkability of neighborhoods, are seeking. If you're moving somewhere, or you're considering staying somewhere downtown or in an area you want to visit for vacation, this is amazing. What an incredible resource, and because of that Walk Score has become hugely popular and is part of many, many real estate websites and visitor and tourism focused websites and all that kind of stuff.

Another good example, blog posts that provide information that was previously unavailable anywhere else. In our industry I actually really like this example from Conductor. Conductor, as you might know, is an enterprise SEO software company, and they put together a phenomenal blog post comparing which portions of direct traffic are almost certainly actually organic, and they collected a bunch of anonymized data from their platform and assembled that so that we could all see, "Oh, yeah, look at that. Sixty percent of what's getting counted as direct in a lot of these websites, at least on average, is probably coming from organic search or dark social and those kinds of things, and that credit should go to the marketers who acquire that traffic." Fascinating stuff. Unique information, couldn't find that elsewhere.

3) Content presented with a massively differentiated voice or style

The third and final one that I'll talk about is content that's presented with a massively differentiated voice or style. So this is not necessarily you've aggregated information that was previously unavailable or you've made it more accessible or you've designed it in a way to make it remarkable. It's not necessarily information available nowhere else. It's really more about the writer or the artist behind the content creation, and content creators, the great ones, have some artistry to their work. You're trying to create in your visitors this impression of like, "I've seen stuff about this before, but never in a way that emotionally resonated with me like this does." Think about the experience that you have of reading a phenomenal book about a topic versus just reading the Wikipedia entry. The information might be the same, but there are miles of difference in the artistry behind it and the emotional resonance it can create.

In the content marketing world, I really like a lot of stuff that Beardbrand does. Eric from Beardbrand just puts together these great videos. He has this gigantic beard. I feel like I've really captured him here actually. Eric, tell me what you think of this portrait? You're free to use it as your Twitter background, if you'd like. Eric's videos are not just useful. They do contain useful information and stuff that sometimes is hard to find elsewhere, but they have a style to them, a personality to them that I just love.

Likewise, for many of you, you're watching Whiteboard Friday or consuming content from us that you likely could find many other places. Unlike when Moz started, there are many, many great blogs and resources on SEO and inbound marketing and social media marketing, and all these things, but Moz often has a great voice, a great style, at least one that resonates with me, that I love.

Another example, one from my personal life, my wife's blog -- the Everywhereist. There are lots of places you can read, for example, a history of Ireland. But when Geraldine wrote about her not-so-brief history of Ireland, it had a very different kind of emotional resonance for many other people who read and consumed that and, as a result, earned lots of nice traffic and shares and links and all of these kinds of things.

This, one of these three, is what you're aiming for with uniquely valuable, and there are likely some others that fit into these or maybe that cross over between them. But if you're making content for the web and you're trying to figure out how can I be uniquely valuable, see what it is that you're fitting into, which of these themes, hopefully maybe even some combination of them, and is that defensible enough to make you differentiated from your competition, from what else is in the search results, and does it give you the potential to have truly remarkable content and SEO going forward. If not, I'm not sure that it's worth the investment. There's no prize in content for hitting Publish. No prize for hitting Publish. The only prize comes when you produce something that meets these criteria and thus achieves the reach and the marketing goals that you're seeking.

All right, everyone, we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



12 Common Reasons Reconsideration Requests Fail


Posted by Modestos

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of Moz, Inc.

There are several reasons a reconsideration request might fail. But some of the most common mistakes site owners and inexperienced SEOs make when trying to lift a link-related Google penalty are entirely avoidable.

Here's a list of the top 12 most common mistakes made when submitting reconsideration requests, and how you can prevent them.

1. Insufficient link data

This is one of the most common reasons why reconsideration requests fail. This mistake is readily evident each time a reconsideration request gets rejected and the example URLs provided by Google are unknown to the webmaster. Relying only on Webmaster Tools data isn't enough, as Google has repeatedly said. You need to combine data from as many different sources as possible.

A good starting point is to collate backlink data, at the very least:

  • Google Webmaster Tools (both latest and sample links)
  • Bing Webmaster Tools
  • Majestic SEO (Fresh Index)
  • Ahrefs
  • Open Site Explorer

If you use any toxic link-detection services (e.g., Linkrisk and Link Detox), then you need to take a few precautions to ensure the following:

  • They are 100% transparent about their backlink data sources
  • They have imported all backlink data
  • You can upload your own backlink data (e.g., Webmaster Tools) without any limitations

If you work on large websites that have tons of backlinks, most of these automated services will very likely process just a fraction of the links unless you pay for one of their premium packages. If you have direct access to the above data sources, it's worthwhile to download all backlink data, then manually upload it into your tool of choice for processing. This is the only way to have full visibility over the backlink data that has to be analyzed and reviewed later. Starting with an incomplete data set at this early (yet crucial) stage could seriously hinder the outcome of your reconsideration request.

2. Missing vital legacy information

The more you know about a site's history and past activities, the better. You need to find out (a) which pages were targeted in the past as part of link building campaigns, (b) which keywords were the primary focus and (c) the link building tactics that were scaled (or abused) most frequently. Knowing enough about a site's past activities, before it was penalized, can help you home in on the actual causes of the penalty. Also, collect as much information as possible from the site owners.

3. Misjudgement

Misreading your current situation can lead to wrong decisions. One common mistake is to treat the example URLs provided by Google as gospel and try to identify only links with the same patterns. Google provides a very small number of examples of unnatural links. Often, these examples are the most obvious and straightforward ones. However, you should look beyond these examples to fully address the issues and take the necessary actions against all types of unnatural links.

Google is very clear on the matter: “Please correct or remove all inorganic links, not limited to the samples provided above.”

Another common area of bad judgement is the inability to correctly identify unnatural links. This is a skill that requires years of experience in link auditing, as well as link building. Removing the wrong links won't lift the penalty, and may also result in further ranking drops and loss of traffic. You must remove the right links.


4. Blind reliance on tools

There are numerous unnatural link-detection tools available on the market, and over the years I've had the chance to try out most (if not all) of them. Because (and without any exception) I've found them all very ineffective and inaccurate, I do not rely on any such tools for my day-to-day work. In some cases, a lot of the reported "high risk" links were 100% natural links, and in others, numerous toxic links were completely missed. If you have to manually review all the links to discover the unnatural ones, ensuring you don't accidentally remove any natural ones, it makes no sense to pay for tools.

If you solely rely on automated tools to identify the unnatural links, you will need a miracle for your reconsideration request to be successful. The only tool you really need is a powerful backlink crawler that can accurately report the current link status of each URL you have collected. You should then manually review all currently active links and decide which ones to remove.

I could write an entire book on the numerous flaws and bugs I have come across each time I've tried some of the most popular link auditing tools. A lot of these issues can be detrimental to the outcome of the reconsideration request, and I have seen many reconsideration requests fail because of them. If Google cannot algorithmically identify all unnatural links and must operate entire teams of humans to review sites (and their links), you shouldn't trust a $99/month service to identify the unnatural links.

If you have an in-depth understanding of Google's link schemes, you can build your own process to prioritize which links are more likely to be unnatural, as I described in this post (see sections 7 & 8). In an ideal world, you should manually review every single link pointing to your site. Where this isn't possible (e.g., when dealing with an enormous number of links, or when resources are unavailable), you should at least focus on the links that carry the most "unnatural" signals and manually review those.

5. Not looking beyond direct links

When trying to lift a link-related penalty, you need to look into all the links that may be pointing to your site directly or indirectly. Such checks include reviewing all links pointing to other sites that have been redirected to your site, legacy URLs with external inbound links that have been internally redirected, and third-party sites that include cross-domain canonicals to your site. For sites that used to buy and redirect domains in order to increase their rankings, the quickest solution is to get rid of the redirects. Both Majestic SEO and Ahrefs report redirects, but some manual digging usually reveals a lot more.


6. Not looking beyond the first link

All major link intelligence tools, including Majestic SEO, Ahrefs and Open Site Explorer, report only the first link pointing to a given site when crawling a page. This means that if you rely heavily on automated tools to identify links with commercial keywords, the vast majority of them will only take into consideration the first link they discover on a page. If a page on the web links just once to your site, this is no big deal. But if there are multiple links, the tools will miss all but the first one.

For example, if a page has five different links pointing to your site, and the first one includes branded anchor text, these tools will report just that first link. Most of the link-auditing tools will in turn evaluate the link as "natural" and completely miss the other four links, some of which may contain manipulative anchor text. The more links that get missed this way, the more likely your reconsideration request is to fail.
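To make that concrete, here's a minimal hypothetical snippet (the domain and anchors are invented) of a page linking to the same site twice:

<!-- Most tools will report and evaluate only this first, branded link... -->
<a href="http://www.example-lawfirm.com/">Example Law Firm</a> published a useful overview.
<!-- ...and never see this second, keyword-rich link on the same page -->
They also rank a page for <a href="http://www.example-lawfirm.com/denver-personal-injury-lawyer/">Denver personal injury lawyer</a>.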

7. Going too thin

Many SEOs and webmasters (still) feel uncomfortable with the idea of losing links. They cannot accept that links which once helped their rankings are now being devalued and must be removed. There is no point trying to save "authoritative" yet unnatural links out of fear of losing rankings. If the main objective is to lift the penalty, then all unnatural links need to be removed.

Often, in the first reconsideration request, SEOs and site owners tend to go too thin, and in the subsequent attempts start cutting deeper. If you are already aware of the unnatural links pointing to your site, try to get rid of them from the very beginning. I have seen examples of unnatural links provided by Google on PR 9/DA 98 sites. Metrics do not matter when it comes to lifting a penalty. If a link is manipulative, it has to go.

In any case, Google's decision won't be based only on the number of links that have been removed. Most important in the search giant's eyes is the quality of the links still pointing to your site. If the remaining links are largely of low quality, the reconsideration request will almost certainly fail.

8. Insufficient effort to remove links

Google wants to see a "good faith" effort to get as many links removed as possible. The higher the percentage of unnatural links removed, the better. Some agencies and SEO consultants tend to rely too much on the disavow tool. However, it isn't a panacea, and should be used as a last resort, reserved for links that are impossible to remove after you've exhausted all possibilities to physically remove them via the time-consuming (yet necessary) outreach route.

Google is very clear on this:


Even if you're unable to remove all of the links that need to be removed, you must be able to demonstrate that you've made several attempts to have them removed, which can have a favorable impact on the outcome of the reconsideration request. Yes, in some cases it might be possible to have a penalty lifted simply by disavowing instead of removing the links, but these cases are rare and this strategy may backfire in the future. When I reached out to ex-Googler Fili Wiese for some advice on the value of removing the toxic links (instead of just disavowing them), his response was very straightforward:


9. Ineffective outreach

Simply identifying the unnatural links won't get the penalty lifted unless a decent percentage of the links have been successfully removed. The more communication channels you try, the more likely it is that you reach the webmaster and get the links removed. Sending the same email hundreds or thousands of times is highly unlikely to result in a decent response rate. Trying to remove a link from a directory is very different from trying to get rid of a link appearing in a press release, so you should take a more targeted approach with a well-crafted, personalized email. Link removal request emails must be honest and to the point, or else they'll be ignored.

Tracking the emails will also help in figuring out which messages have been read, which webmasters might be worth contacting again, and when you need to try an alternative means of contacting a webmaster.

Creativity, too, can play a big part in the link removal process. For example, it might be necessary to use social media to reach the right contact. Again, don't trust automated emails or contact form harvesters. In some cases, these applications will pull in any email address they find on the crawled page (without any guarantee of who the information belongs to). In others, they will completely miss masked email addresses or those appearing in images. If you really want to see that the links are removed, outreach should be carried out by experienced outreach specialists. Unfortunately, there aren't any shortcuts to effective outreach.

10. Quality issues and human errors

All sorts of human errors can occur when filing a reconsideration request. The most common errors include submitting files that do not exist, files that do not open, files that contain incomplete data, and files that take too long to load. You need to triple-check that the files you are including in your reconsideration request are read-only, and that anyone with the URL can fully access them.

Poor grammar and language is also bad practice, as it may be interpreted as "poor effort." You should definitely get the reconsideration request proofread by a couple of people to be sure it is flawless. A poorly written reconsideration request can significantly hinder your overall efforts.

Quality issues can also occur with the disavow file submission. Disavowing at the URL level isn't recommended because the link(s) you want to get rid of are often accessible to search engines via several URLs you may be unaware of. Therefore, it is strongly recommended that you disavow at the domain or sub-domain level.
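For reference, the disavow file itself is just a plain .txt list with one entry per line, and lines starting with # are treated as comments. A minimal sketch disavowing at the domain level (the domains and notes below are made up):

# Contacted the site owner twice via email and contact form, no response
domain:spammy-directory-example.com
# Article network, removal request ignored
domain:paid-links-example.net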

11. Insufficient evidence

How does Google know you have done everything you claim in your reconsideration request? Because you have to prove each claim is valid, you need to document every single action you take, from sent emails and submitted forms, to social media nudges and phone calls. The more information you share with Google in your reconsideration request, the better. This is the exact wording from Google:

“...we will also need to see good-faith efforts to remove a large portion of inorganic links from the web wherever possible.”

12. Bad communication

How you communicate your link cleanup efforts is as essential as the work you are expected to carry out. Not only do you need to explain the steps you've taken to address the issues, but you also need to share supportive information and detailed evidence. The reconsideration request is the only chance you have to communicate to Google which issues you have identified, and what you've done to address them. Being honest and transparent is vital for the success of the reconsideration request.

There is absolutely no point using the space in a reconsideration request to argue with Google. Some of the unnatural link examples they share may not always be useful (e.g., URLs that include nofollow links, removed links, or even no links at all). But taking an argumentative approach virtually guarantees your request will be denied.


Conclusion

Getting a Google penalty lifted requires a good understanding of why you have been penalized, a flawless process and a great deal of hands-on work. Performing link audits for the purpose of lifting a penalty can be very challenging, and should only be carried out by experienced consultants. If you are not 100% sure you can take all the required actions, seek out expert help rather than looking for inexpensive (and ineffective) automated solutions. Otherwise, you will almost certainly end up wasting weeks or months of your precious time, and in the end, see your request denied.



Case Study: One Site’s Recovery from an Ugly SEO Mess

Google Panda Site Recovery

Posted by AlanBleiweiss

This past March, I was contacted by a prospective client:

My site has been up since 2004. I had good traffic growth up to 2012 (doubling each year to around a million page views a month), then suffered a 40% drop in mid Feb 2012. I've been working on everything that I can think of since, but the traffic has never recovered.

Since my primary business is performing strategic site audits, this is something I hear often. Site appears to be doing quite well, then gets slammed. Site owner struggles for years to fix it, but repeatedly comes up empty.

It can be devastating when that happens. And now more than ever, site owners need real solutions to serious problems.

As this chart shows, when separating out the "expected" roller coaster effect, Google organic traffic took a nose-dive in early February of 2012.

First step: check and correlate with known updates

When this happens, the first thing I do is jump to Moz's Google Algorithm Change History charts to see if I can pinpoint a known Google update that correlates to a drop.

Except in this case, there was no direct "same day" update listed.

A week before, there's an entry listed that references integrating Panda into the main index more; however, discussion around that suggests the change happened sometime in January. So maybe it's Panda, maybe it's not.

Expand your timeline: look for other hits

At this point, I expanded the timeline view to see if I could spot other specific drops and possibly associate them to known updates. I did this because some sites that get hit once, get hit again and again.

Google Organic Historic Visit View

Well now we have a complete mess.

At this point, if you're up for the challenge, you can take the time to carefully review all the ups and downs manually, comparing drops to Moz's change history.

Personally, when I see something this ugly, I prefer to use the Panguin Tool. It allows you to see this timeline with a "known update" overlay for various Google updates. Saves a lot of time. So that's what I did.

Panguin Tool Historic View

Well okay this is an ugly mess as well. If you think you can pin enough of the drops on specific factors, that's great.

What I like about the Panguin Tool is you can "turn off" or "hide" different update types to try and look for a consistent issue type. Alternately, you can zoom in to look at individual updates and see if they align with a specific, clear drop in traffic.

Zoomed In Panguin Evaluation

Looking at this chart, it's pretty clear the site saw a second dropoff beginning with Panda 3.3. The next dropoff aligned with updates appears to be Panda 3.4; however, the site was already in a slide after Panda 3.3, so we can't be certain of that one.

Multiple other updates took place after that where there may or may not have been some impact, followed by further cascading downward.

Then, in the midst of THAT dropoff, we see a Penguin update that also MAY or MAY NOT have played into the problem.

The ambiguous reality

This is a great time to bring up the fact that one of the biggest challenges we face in dealing with SEO is the ambiguous nature of what takes place. We can't always, with true certainty, know whether a site has been hit by any single algorithm change.

In between all the known updates, Google is constantly making adjustments.

The other factor here is that when a given update takes place, it doesn't always roll out instantly, nor is every site reprocessed against that latest change right away.

The cascading impact effect

Here's where evaluating things becomes even more of a mess.

When an algorithm update takes place, it may be days or even weeks before a site sees the impact of that, if at all. And once it does, whatever changes a single algorithm shift makes to the overall status of a site, other algorithms will sometimes then base their own formulaic decisions on that new status.

So if a site becomes weaker due to a given algorithm change, even if the drop is minimal or barely observable, it can still suffer further losses due to that weakened state.

I refer to this as the "cascading impact" effect.

The right solution to cascading impact losses

Okay, so let's say you're dealing with a site that appears to have been hit by multiple algorithm updates. Maybe some of them are Panda, maybe others aren't.

The only correct approach in this scenario is to step back and understand that for maximum sustainable improvement, you need to consider every aspect of SEO. Heck, even if a site was ONLY hit by Panda, or Penguin, or the "Above the Fold" algorithm, I always approach my audits with this mindset. It's the only way to ensure that a site becomes more resilient to future updates of any type.

And when you approach it this way, because you're looking at the "across-the-board" considerations, you're much more likely to address the actual issues that you can associate with any single algorithm.

The QUART mindset

It was at this point where I began to do my work.

A couple years ago, I coined the acronym QUART—what I call the five super-signals of SEO:

  • Quality
  • Uniqueness
  • Authority
  • Relevance
  • Trust

With every single factor across the full spectrum of signals in SEO, I apply the QUART* test. Any single signal needs to score high in at least three of the five super-signals.

Whether it's a speed issue, a crawl efficiency issue, topical focus, supporting signals on-site or off-site, whatever it is, if that signal does not score well with quality, uniqueness or relevance, it leaves that page, that section of a site, or that site as a whole vulnerable to algorithmic hits.

If you get those three strong enough, that signal will, over time, earn authority and trust score value as well.

If you are really strong with relevance with any single page or section of the site, but weak in quality or uniqueness, you can still do well in SEO if the overall site is over-the-top with authority and trust.

*When I first came up with this acronym, I had the sequence of letters as QURTA, since quality, uniqueness, and relevance are, in my opinion, the true ideal target above all else. New sites don't have authority or trust, yet they can be perfectly good, valuable sites if they hit those three. Except Jen Lopez suggested that if I shift the letters for the acronym, it would make it a much easier concept for people to remember. Thanks Jen!

A frustratingly true example

Let's say you have one single page of content that's only "okay" or may even be "dismal" in regard to quality and uniqueness. If the site's overall authority and trust are strong enough, that page can still outrank an entire site devoted to that specific topic.

This happens all the time with sites like Wikipedia, or Yahoo Answers.

Don't you hate that? Yeah, I know—Yahoo Answers? Trust? Ha!

Sadly, some sites have, over time, built so much visibility, brand recognition, and trust for enough of their content, that they can seemingly get away with SEO murder.

It's frustrating to see. Yet the foundational concept as to WHY that happens is understandable if you apply the QUART test.

Spot mobile site issues

One challenge this site has is that there's also a separate mobile subdomain. Looking at the Google traffic for that shows similar problems, beginning back in February of 2012.

Mobile Site Historic Google Traffic

Note that for the most part, the mobile site suffered from that same major initial hit and subsequent downslide. The one big exception was a technical issue unique to the mobile site at the end of 2012 / beginning of 2013.

Identify and address priority issues

Understanding the QUART concept, and having been doing this work for years, I dove head-first into the audit.

Page processing and crawl efficiency

Before and after Page Speeds
NOTE: This is an educational site – so all "educational page" labels refer to different primary pages on the site.

For my audits, I rely upon Google Analytics Page Timings data, URIValet.com 1.5 mbps data, and also WebPageTest.org (testing from different server locations and at different speeds including DSL, Cable and Mobile).

Speed improvement goals

Whenever I present audit findings to a client, I explain "Here's the ideal goal for this issue, yet I don't expect you to hit the ideal goal, only that you do your best to make improvements without becoming bogged down in this one issue."

For this site, since not every single page had crisis speed problems, I was looking to have the site owner at least get to a point of better, more consistent stability. So while there's still room for vast improvement, the work performed went quite far in the right direction.

Speed issues addressed: domain and process calls

The first issue tackled was the fact that at the template level, the various pages on the site were calling several different processes across several different domains.

A great resource that I rely upon for generating lists of the third-party processes individual pages use is a report in the WebPageTest.org results. It lists every domain called for the page tested, along with the total processes called from each, and gives separate data on the total file sizes across each.

Reducing the number of times a page has to call a third party domain, and the number of times an individual process needs to be run is often a way to help speed up functionality.

Improving third party domain requests

In the case of this site, several processes were eliminated:

  • Google APIs
  • Google Themes
  • Google User Content
  • Clicktale
  • Gstatic
  • RackCDN

Eliminating functionality that was dependent upon third-party servers meant fewer DNS lookups and less dependence upon connections to other servers elsewhere on the web.

Typical service drains can often come from ad blocks (serving too many ads from too many different ad networks is a frequent speed drain culprit), social sharing widgets, third party font generation, and countless other shiny object services.

Clean code

Yeah, I know—you don't have to have 100% validated code for SEO. Except what I've found through years of this work, is that the more errors you have in your markup, the more likely there will be potential for processing delays, and beyond that, the more likely search algorithms will become confused.

And even if you can't prove in a given site that cleaner code is a significant speed improvement point, it's still a best practice, which is what I live for. So it's always included in my audit evaluation process.

HTML Markup Improvements
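
A trivial, invented example of the kind of markup error that tends to pile up, and its cleanup:

<!-- Before: an unclosed paragraph and a duplicate id confuse both parsers and validators -->
<div id="promo"><p>Spring term enrollment is open<div id="promo">Enroll today</div></div>

<!-- After: properly closed tags and unique ids -->
<div id="promo-banner"><p>Spring term enrollment is open</p></div>
<div id="promo-footer"><p>Enroll today</p></div>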

Improve process efficiency

Next up on the list was the range of issues all too many sites have these days regarding efficiency within a site's own content. Tools to help here include Google Webmaster Tools, Google Page Speed Insights, and again WebPageTest.org among others.

Issues I'm talking about here include above-the-fold render-blocking JavaScript and CSS, lack of browser caching, lack of compression of static content, slow server response times, a host of code-bloat considerations, oversized images, and the list goes on...
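
As a small illustration of just one of those items, render-blocking JavaScript, here's a hypothetical before/after (file names invented for the example):

<!-- Before: a script in the head blocks rendering until it downloads and executes -->
<head>
  <script src="/js/site.js"></script>
</head>

<!-- After: the same script is deferred so the browser can paint the page first -->
<head>
  <script src="/js/site.js" defer></script>
</head>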

Google Page Speed Insights Grades
NOTE: This is an educational site, so all "educational page" labels refer to different primary pages on the site.

Note: Google Page Speed Insights recommendations and WebPageTest.org's grade reports only offer partial insight. What they do offer, however, can help you go a long way toward making speed improvements.

Also, other speed reporting tools abound, with differing degrees of value, accuracy, and helpfulness. The most important factor to me is to not rely on any single resource, and to do your own extensive testing. Ultimately, enough research and testing needs to be performed, with follow-up checking, to ensure you address the real issues on a big enough scale to make a difference. Just glossing over things or only hitting the most obvious problems is not always going to get you real, long-term, sustainable results...

Correct crawl inefficiency

Another common problem I find is that as a site evolves over time, many of its URLs change, and site owners don't properly clean up their own internal links to those pages. The end result is weakened crawl efficiency, and then weakened user experience quality and trust signals.

Remember that Google and Bing are, in fact, users of your site, whether you want to admit it or not. So if they're crawling the site and run into too many internal redirects (or, heaven forbid, redirect loops), or dead ends, that's going to make their systems wary of bothering to continue the crawl. And an abandoned crawl is not helpful by any stretch of the imagination.

It also confuses algorithms.

To that end, I like to crawl a sampling of a site's total internal links using Screaming Frog. That tool gives me many different insights, only one of which happens to be internal link problems. Yet it's invaluable to know. And if a large enough percentage of the URLs in that sample crawl are redirecting or dead ends, that needs to get fixed.

Eliminate crawl inefficiency

Note: for reference sake, the total number of pages on the entire site is less than 5,000. So that's a lot of internal inefficiency for that size site...
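
The fix itself is unglamorous: update the internal links to point directly at the current, final URLs. A hypothetical example (these paths are invented, not the client's):

<!-- Before: internal link points at a URL that now 301 redirects (or 404s) -->
<a href="/old-lesson-page.html">Algebra lesson overview</a>

<!-- After: link updated to the final destination URL, no redirect hop needed -->
<a href="/lessons/algebra/overview/">Algebra lesson overview</a>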

Don't ignore external links

While having link redirects and dead ends pointing to outside sources isn't ideal, it's less harmful most of the time than internal redirects and dead ends. Except when it's not.

In this case, the site had previously been under a different domain name prior to a rebranding effort. And after the migration, it resulted in some ugly redirect loops involving the old domain!

Redirect Loop

Topical focus evaluation

At this point, the audit moved from the truly technical issues to the truly content related issues. Of course, since it's algorithms that do the work to "figure it all out," even content issues are "technical" in nature. Yet that's a completely different rant. So let's just move on to the list of issues identified that we can associate with content evaluation.

Improve H1 and H2 headline tags

Yeah, I know—some of you think these are irrelevant. They're really not. They are one more contributing factor when search engines look to multiple signals for understanding the unique topic of a given page.

Noindex large volume of "thin" content pages

Typical scenario here: a lot of pages that have very little to no actual "unique" content—at least not enough crawlable content to justify their earning high rankings for their unique focus. Be aware—this doesn't just include the content within the main "content" area of a page. If that content is surrounded (as was the case on this site) by blocks of content common to other pages, if the main navigation or footer navigation is bloated with too many links (and the words surrounding them in the code), or if you offer too many shiny object widgets (as this site did), that "unique" content evaluation is going to become strained (as it did here).
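
Where a thin page needs to stay live for visitors but shouldn't compete in the index, the standard mechanism is a robots meta tag in the page head. A minimal sketch of the general pattern (the post doesn't spell out the exact directives used on every page here):

<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex,follow">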

Add robust crawlable relevant content to video pages

You can have the greatest videos on the planet on your site. And yet, if you're not a CNN or some other truly well-established, high-authority site, you are almost always going to need to add high-quality, truly relevant content to the pages that host those videos. So that was done here.

And I'm not just talking about "filler" content. In this case (as it always should be) the new content was well written and supportive of the content in the videos.

Eliminate "shiny object" "generic" content that was causing duplicate content / topical dilution confusion across the site

For pages that had thin content but were worth salvaging, I never recommend throwing them out. Instead, take the time to add more valuable content, yes. But also consider eliminating some of those shiny objects. For this site, reducing them vastly improved the uniqueness of those pages.

Improve hierarchical URL content funnels reducing the "flat" nature of content

Flat architecture is an SEO myth. Want to know how I know this? I read it on the Internet, that's how!

The Flat Architecture Myth

Oh wait. That was ME who said it.

Seriously though. If every URL on your site looks like www.domain.com/page-one, www.domain.com/page-two, and www.domain.com/page-three, no matter what the topic, that's flat architecture.

It claims "every one of these pages is as important as every other page on my site."

And that's a fantasy.

It also severely harms your ability to communicate "here's all the content specific to this category, or this sub-category." And THAT harms your ability to say "hey, this site is robust with content about this broad topic."

So please. Stop with the flat architecture.

And no, this is NOT just for search engines. Users who see proper URL funnels get an immediate cue as to where they are on the site (and when they see those URLs in the search results, more confidence about trust factors).
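
As a purely hypothetical illustration of the difference (these are not the client's actual URLs):

Flat: www.domain.com/quadratic-equations
Hierarchical: www.domain.com/math/algebra/quadratic-equations/

The second form tells users and algorithms at a glance that the page lives in an algebra section within a broader math topic.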

So for this site, reorganization of content was called for and implemented.

Add site-wide breadcrumb navigation

Yes, breadcrumbs are helpful: they reinforce topical focus and content organization, and they improve the user experience.
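
A minimal sketch of what site-wide breadcrumbs can look like, using invented section names and schema.org BreadcrumbList microdata (the post doesn't specify which markup format the client chose):

<ol itemscope itemtype="http://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="/math/"><span itemprop="name">Math</span></a>
    <meta itemprop="position" content="1">
  </li>
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="/math/algebra/"><span itemprop="name">Algebra</span></a>
    <meta itemprop="position" content="2">
  </li>
</ol>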

So these were added.

Noindex,nofollowed over 1,300 "orphaned" pages

Pop-up windows. They're great for sharing additional information with site visitors. Except when you allow them to become indexable by search engines. Then all of a sudden you've got countless random pages that, on their own, have no meaning, no usability, and no way to communicate "this is how this page relates to all those other pages over there." They're an SEO signal killer. So we lopped them out of the mix with a machete.

Sometimes you may want to keep them indexable. If you do, they need full site navigation and branding, and proper URL hierarchical designations. So pay attention to whether it's worth it to do that or not.
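
For the 1,300+ pages here that were not kept indexable, the directive differs slightly from the thin-content case above: since those pop-up pages had no navigation of their own and no standalone meaning, there was no reason to have their links followed either. A minimal sketch of the pattern (not a copy of the client's template):

<!-- On orphaned pop-up pages: keep them out of the index and don't follow their links -->
<meta name="robots" content="noindex,nofollow">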

Remove UX confusing widget functionality

One particular widget on the site was confusing from a UX perspective. This particular issue had as much to do with site trust and overall usability as anything, and less to do with pure SEO. Except it caused some speed delays, needless site-jumping, repetition of effort and a serious weakening of brand trust. And those definitely impact SEO, so it was eliminated.

Noindex internal site "search" results pages

Duplicate content. Eliminated. 'nuff said?

Eliminate multiple category assignments for blog articles

More duplicate content issues. Sometimes you can keep these; however, if multiple category assignments get out of hand, it really IS a duplicate content problem. So in this case, we resolved that.
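
Where multiple category assignments need to stay for usability, one common mitigation (the post doesn't spell out exactly which approach was used here) is a canonical tag on the duplicate category-path versions pointing at one preferred URL. A hypothetical sketch:

<!-- On www.example.com/blog/category-b/post-title/ : point engines at the preferred version -->
<link rel="canonical" href="http://www.example.com/blog/category-a/post-title/">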

Unify brand identity across pages from old branding that had been migrated

Old brand, new brand—both were intermingled after the site migration I previously described. Some of it was a direct SEO issue (old brand name in many page titles, in various on-site links and content) and some was purely a UX trust factor.

Unify main navigation across pages from old branded version that had been migrated

Again, this was a migration issue gone wrong. Half the site had consistent top navigation based on the new design, and half had imported the old main navigation. An ugly UX, crawl and topical understanding nightmare.

Add missing meta descriptions

Some of the bizarre auto-generated meta descriptions Google had been presenting on various searches were downright ugly. We killed that click-block dead by adding meta descriptions to over 1,000 pages.

Remove extremely over-optimized meta keywords tag

Not a problem, you say? Ask Duane Forrester. He'll confirm it's one of many signal points Bing uses to seek out potential over-optimization. So why risk leaving them there?
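
Tying these last two items together, here's a hypothetical head snippet showing the pattern (the description text is invented for illustration): a unique, human-written description is added, and the stuffed keywords tag comes out.

<!-- Added: a unique, human-written description for the search snippet -->
<meta name="description" content="Step-by-step algebra lessons with worked examples and practice problems.">
<!-- Removed: <meta name="keywords" content="algebra, algebra lessons, learn algebra, algebra help, free algebra"> -->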

About inbound links

While I found some toxic inbound links in the profile, there weren't many on this site. Most of those actually disappeared on their own, thanks to all the other wars that continue to rage in the Penguin arena. So for this site, no major effort has yet gone into cleaning up the small number that remain.

Results

Okay so what did all of this do in regard to the recovery I mention in the title? You tell me.

And here's just a small fraction of the top phrase ranking changes:

Ranking Improvements After Audit

Next steps

While the above charts show quite serious improvements since the implementation was started, there's more work that remains.

Google Ad Scripts continue to be a big problem. Errors at the code level and processing delays abound. It's an ongoing issue many site owners struggle with. Heck—just eliminating Google's own ad server tracking code has given some of my clients as much as one to three seconds of overall page processing improvement, depending on the number of ad blocks, as well as on intermittent problems on Google's ad server network.

Except at a certain point, ads are the life-blood of site owners. So that's a pain-point we may not be able to resolve.

Other third party processes come with similar problems. Sometimes third party "solution" providers are willing to improve their offerings; however, the typical answer to "your widget is killing my site" is "blah blah blah not our fault blah blah blah," when I know for a fact, from countless tests, that it is.

So in this case, the client is doing what they can elsewhere for now. And ultimately, if need be, will abandon at least some of those third parties entirely if they can get a quality replacement.

And content improvements—there's always more to do on that issue.

Bottom line

This is just one site, in one niche market. The work has been and continues to be extensive. It is, however, quite typical of many sites that suffer from a range of issues, not all of which can be pinned on Panda. Yet ignoring issues you THINK might not be Panda-specific is a dangerous game, especially now in 2015, when it's only going to get uglier out there...

So do the right thing for the site owner / your employer / your client / your affiliate site revenue...

