
When Is a Blog the Right Form of Content Marketing?

Posted by Isla_McKetta

You've heard the wisdom:

"Your business should have a blog."

"Blogging helps your SEO."

"Why aren't you blogging yet?"

According to the experts, a blog will solve all your Internet woes. Blogging will increase your traffic, expand your audience, improve your engagement, position you as an authority, and allow you to shape the message in your space.

In fact, blogging is so hyped as a panacea, you'd think that simply adding a blog to your site would also help you find the perfect spouse, cure the common cold, and even turn lead into gold.

While I won't deny the power of a good blog on the right site (seriously, as a writer, I'm pro-blog in general) to do all of those good things and more, you should always question anything that's touted as the right answer for everyone (and everything). So should you blog?

When a blog is NOT necessarily the right form of content marketing

Now that you're asking whether all that time and energy you're putting (or planning to put) into your blog is really the right investment, let's look at a few examples of when blogging is a bad idea (or is simply unnecessary).

1. You own your market

Johnson & Johnson. Amazon. Target. Google. These companies have already captured the hearts and minds of so many consumers that their names are nearly synonymous with their products. Here's why blogging would only offer each of them a marginal benefit.

Traffic

Does Johnson & Johnson really care about traffic to its site when you already have Band-Aids (and all their other name brand products) in your medicine cabinet? Sure, they produce infographics, but there's no real blog, and you were going to buy their products anyway, right?

Audience reach

Ordering anything from books to pet-waste bags online? You didn't need a blog to discover Amazon; it's so ingrained in your Internet history that you probably went straight there, and those products will be on your doorstep in two days or less.

Engagement

Target mastered engagement when Oprah and Tyra started referring to the store as Tarzhay and shoppers only got more loyal as they added designer labels at discount prices. It didn't matter that most of their products weren't even available on their website, let alone that they didn't have a blog. Their site has gotten a lot better in the past decade, but they still don't need a blog to get customers in the door.

Authority

And Google… Sure they have a blog, but Google is such an authority for search queries that most of the consumers of their search results have no interest in, or need for, the blog. So if you have little or no competition or your business is (and you expect it to remain) the top-of-mind brand in your market, you can skip blogging.

2. You have a better way of getting customers into the top of your funnel

A blog is only one way to attract new customers. For example, I live less than a mile from the nearest grocery store, and I can get there and back with a spare stick of butter before my oven even warms up. If the next nearest store had the most amazing blog ever, I'm still not going to go there when I'm missing an ingredient. But if they send me a coupon in the mail, I might just try them out when it's less of an emergency.

The point is that different types of businesses require different types of tactics to get customers to notice them.

My mom, a small-town accountant who knows all of her clients by name, doesn't blog. She's much more likely to get recommended by a neighbor than to be found on the Internet. If paid search brings you $50k in conversions every month and your blog contributes to $10k, it's easy (and fair) to prioritize paid search. If you find that readers of white papers are the hottest leads for your SaaS company, offering a 50:1 ROI over blog readers, write those white papers. And if your customers are sharing your deals across email and/or social at a rate that your blog has never seen, give them more of what they want.

None of that means you'll never have to create a blog. Instead, a blog might be something to reassess when your rate of growth slows in any of those channels, but if you've crunched your numbers and a blog just doesn't pan out for now, use the tactics your customers are already responding to.

3. The most interesting things about your business are strictly confidential (or highly complicated)

Sure the CIA has a blog, but with posts like "CIA Unveils Portrait of Former Director Leon E. Panetta" and "CIA Reaches Deep to Feed Local Families" it reads more like a failed humanizing effort than anything you'd actually want to subscribe to (or worse, read). If you're in a business where you can't talk about what you do, a blog might not be for you.

For example, while a CPA who handles individual tax returns might have success blogging about tips to avoid a big tax bill at year end, a Big Four accounting firm that specializes in corporate audits might want to think twice about that blog. Do you really have someone on hand who has something new and interesting to say about Sarbanes-Oxley, and who has the time to write?

The difference is engagement. So if you're in a hush-hush or highly technical field, think about what you can reasonably write about and whether anyone is going to want (or legally be able) to publicly comment on or share what you're writing.

Instead, you might want to take the example of Deloitte, which thinks beyond the concept of your typical blog to create all kinds of interesting evergreen content. The result is a host of interesting case studies and podcasts that could have been last updated three years ago for all it matters. This approach puts content on your site, but it also allows you to carefully craft and vet that content before it goes live, without building any expectation associated with an editorial calendar.

4. You think "thought leadership" means rehashing the news

There is a big difference between curating information and regurgitating it. True life confession: As much as I hate the term "thought leader," I used it many a time in my agency days as a way to encourage clients to find the best in themselves. But the truth is, most people don't have the time, energy, or vision to really commit to becoming a thought leader.

A blog can be a huge opportunity to showcase your company's mastery and understanding of your industry. But if you can't find someone to write blog posts that expand on (or rethink) the existing knowledge base, save your ink.

Some people curate and compile information in order to create "top 10" type posts. That kind of content can be helpful for readers who don't have time to source content on their own, but I wouldn't suggest it as the core content strategy for a company's blog. If that's all you have time for, focus on social media instead.

5. Your site is all timely content

A blog can help you shape the message around your industry and your brand, but what if your brand is built entirely around messaging? The BBC doesn't need a blog because any reader would expect what they're reading to be timely content and to adhere to the BBC's standard voice. If readers want to engage with the content by commenting on the articles, they can.

If you can explain the value that blogs.foxnews.com adds to the Fox News site, you've got a keener eye for content strategy than I do. My guess, from the empty blog bubbles here, is that this is a failed (or abandoned) experiment and will soon disappear.

6. Your business is truly offline

There's one final reason that blogging might not fit your business model, and that's if you have chosen not to enter the digital realm. I had lunch with a high-end jeweler in India recently where he was debating whether to go online (he was worried that his designs might get stolen) or continue to do business in person the way his family had done for at least three generations.

If you are successful at selling your products offline, especially if your product has as much variation as a gemstone, an argument can be made for staying offline entirely.

When you should be blogging

Now that we've looked at some times it's okay not to have a blog, let's take a quick look at five reasons you might want to blog as part of your content marketing strategy (just in case you thought you'd gotten off scot-free by almost fitting into one of the boxes above).

1. You want traffic to your website

Conventional wisdom goes that the more pages you build, the more chances you have to rank. Heck, the more (good) content you create on your blog, the more collateral you have to showcase on your social channels, in email, and anywhere else you want to.

2. You want to expand your audience

If the content you're creating is truly awesome, people will share it and find it and love it. Some of those people will be potential customers who haven't even heard of you before. Keep up the excellence and you might just keep them interested.

3. You want to connect with customers

That blog is a fantastic place to answer FAQs, play with new ideas, and show off the humanity of all those fantastic individuals you have working for you. All of those things help customers get to know you, plus they can engage with you directly via the comments. You might find ideas for new campaigns and even new products just by creating that venue for conversation.

4. You have something to add to the discussion

Do you really have a fresh perspective on what's going on in your industry? Help others out by sharing your interesting stories and thoughtful commentary. You're building your authority and the authority of your company at the same time.

5. You're ready to invest in your future

Content is a long game, so the payoffs from blogging may be farther down the road than you might hope. But if a blog is right for your company, you're giving yourself the chance to start shaping the message about your industry and your company the day you publish your first post. Keep at it and you might find that you start attracting customers from amongst your followers.

The gist

Don't blog just because someone told you to. A blog is a huge investment and sustaining that blog can take a lot of work. But there are a lot of good reasons to dig in and blog like you mean it.

What's your decision? Do you have a good reason that you've decided to abstain from blogging? Or have you decided that a blog is the right thing for your business? Help others carefully consider their investment in blogging by sharing your story in the comments.



November 6th, 2014

How to Perform the Ultimate Local SEO Audit

Posted by Casey_Meraz

Every business that competes for localized results in the SERPs will likely need to conduct a local SEO audit at some point. Whether or not you've hired an SEO in the past, the best way to beat the competition is to know where you stand and what you need to fix, and then to develop a plan to win based on the competitive landscape.

While this may seem like a daunting task, the good news is that you can do this for your business or your client using this Ultimate Local SEO audit guide.

This guide was created as a complete checklist and will show you what areas you should focus on, what needs to be optimized, and what you need to do to fix any problems you encounter. To make things easier, I have also included many additional resources for further reading on the topics below.

In this guide I am going to cover the top areas we review for clients who either want to know how they can improve or the ones that need a local SEO audit. To make it easier I have included detailed explanations of the topics and an Excel template you can use to conduct the audit.

Also, since the Pigeon update, local search has started to weigh organic factors more heavily, so I have included them in this audit. However, if after reading this you're looking for an even deeper organic SEO audit, you should also check out Steve Webb's article, "How to Perform the World's Greatest SEO Audit."

Who is this guide for?

This guide is intended for businesses that already have an existing Google My Business page. It's also mostly geared towards brick-and-mortar stores. If you don't have a public address and you're a service-area business, you can ignore the parts where I mention publishing your physical address. If you don't have a listing set up already, it's a little bit harder to audit. That said, new businesses can still use this as a road map.

What we won't cover

The local algorithm is complicated and ever evolving. Although we can look at considerations such as proximity to similar businesses or driving directions requests, I have decided to not include these since we have limited control over them. This audit mainly covers the items the website owner is in direct control over.

A little background

Being ready and willing to adopt change in online marketing is an important factor on the path to success. Search changes, and you have to be ready to change with it. The good news is that if you're constantly trying to do the right thing while being the least imperfect, your results will only get better with updates.

Some goons will always try to cheat the system for a quick win, but they will get caught and penalized eventually. However, if you stick with the right path, you can sleep easier at night knowing you don't have to worry about penalties.

But why are audits so important?

At my company we have found through a lot of trial and error that we can provide the best results for our clients when we start a project off with a complete and full understanding of the project as opposed to just bits and pieces. If we have a complete snapshot of their SEO efforts along with their competition we can create a plan that is going to be much more effective and sustainable.

We now live in a world where marketers not only need to be forward thinking with their strategies but they must also evaluate and consider the work done by prior employees and SEOs who have worked on the website in the past. If you don't know what potential damage has been done, how could you possibly be sure your efforts will help your client long term?

Given the impact and potential severity of penalties, it's irresponsible to ignore this or participate in activities that can harm the client in the long run. Again, sadly, this is a lesson I have learned the hard way.

What aspects does this local SEO audit cover?

Knowing what to include in your audit is a great first step. We have broken our audit down into several different categories we find to be essential to local SEO success. They are:

1) Google My Business page audit

2) Website & landing page audit

3) Citation analysis

4) Organic link & penalty analysis

5) Review analysis

6) Social analysis

7) Competition analysis

8) Ongoing strategy

Analyzing all of these factors will allow you to develop a strategy with a much better picture of the major problems and what you're up against as far as the competition is concerned. If you don't have the full picture with all of the details, you're likely to run into more problems later.

Before we get started, a disclaimer

In this guide I am going to try to break things down to make it easy for beginners and advanced users. That being said, it's a wise idea to seek advice or read more about a topic if you don't quite understand it. If something is over your head, please don't hesitate to reach out for clarification. It's always better to be safe than sorry.

How to use this guide for your local SEO audit

This guide is broken up into two parts: this post and a spreadsheet. The written part that you are reading now corresponds to the spreadsheet, which will allow you to collect pertinent client intake information, record problems, and serve as an easy reference as to what your ultimate goal is for each of the items.

To use the spreadsheet you can click the link and then go to File > Make A Copy.

The complete spreadsheet includes five tabs that each serve a different purpose. They are:

Current info - This tab allows you to record the information the customer submits and compare it against the Google My Business information you find. It also allows you to record your notes for any proposed changes. This will help you when it comes time to report on your findings.

Questions to ask - These are some basic questions you can ask your clients up front that may save a lot of time in the long run.

Competitor information - You can use this tab to track your competitors and compare your metrics side by side.

Top 50 citations audit - This is the list of the top 50 citation sources as provided by Whitespark.

Audit steps - For the more advanced user, I took everything in this long document and condensed it into this easy-to-use spreadsheet with an audit checklist and some small notes on what you're checking for.

Get your audit shoes on. Now let's get started

Step 1: Gather the facts

Whether you're conducting this audit for a client or your own business it's important to start off with the right information. If clients fill out this information properly, you can save a lot of time and also help identify major issues right off the bat. Not only can we help identify some of the common local SEO issues like inconsistent NAP with this information, we can also have it recorded in the spreadsheet I mentioned above.

Since this is an audit, the spreadsheet has information to include the current information and a column for proposed changes for the client. Later, these will be used as action items.

The first tab in this spreadsheet has everything we need to get started under the company information tab. This includes all of the basic information we will need to be successful.

This information should be provided by the client up front so that we can compare it to the information already existing on the web. You can use the audit spreadsheet and enter this under the "Provided Information" column. This will help us identify problems easily as we collect more information.

The basic information we will need to get started includes NAP (name, address, phone) information and other items.

Questions to ask up front

Once we have the basic company information we can also ask some questions. Keep in mind that the goal here is to be the least imperfect. While some of these factors are more important than others, it's always good to do more and have a better understanding of the potential issues rather than taking shortcuts. Shortcuts will just create more work later.

Feel free to edit the spreadsheet and add more questions to your copy based on your experience.

1) Have you ever been penalized or think you may have been? The client should have a good idea if they were penalized in the past.
2) Have you ever hired anyone to build citations for you? If they hired anyone to build citations, they should have some documentation, which will make the citation portion of the audit easier.
3) Have you ever hired an SEO company to work with you? If they hired an SEO in the past it's important to check any work they completed for accuracy.
4) Have you ever hired anyone to build links for you? If they have hired anyone in the past to build links they will hopefully have documentation you can review. If you see bad links you know you will have your work cut out for you.
5) What are the primary keywords you want to rank for? Knowing what the client wants and developing a strategy based off this is essential to your local SEO success.
6) Have you ever used another business name in the past? Companies that used a different name or that were acquired can lead to NAP inconsistencies.
7) Is your business address a PO Box? PO Boxes and UPS boxes are a no-no. It's good to know this up front before you get started.
8) Is your phone number a landline? Some local SEOs claim that landlines may provide some benefit. Regardless, it's good to know where the phone number is registered.
9) Do other websites 301 redirect to your website? If other websites redirect to their domain you may need to do an analysis on these domains as well. Specifically for penalty evaluation.
10) Did you ever previously use call tracking numbers? Previously used call tracking numbers can be a nightmare as far as local SEO is concerned. If a client previously used call tracking numbers you will want to search for these when we get to the citation portion of this document. Cleaning up wrong phone numbers, including tracking numbers, in the local ecosystem is essential to your local success.
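Several of the questions above (toll-free lines, old call tracking numbers) come down to comparing phone numbers that may be formatted differently from one source to the next. Here's a minimal sketch of normalizing US numbers before comparing them; the helper name and the sample citation data are hypothetical, not part of any specific tool:

```python
import re

def normalize_us_phone(raw):
    """Reduce a US phone number to its 10 core digits so that
    differently formatted citations can be compared directly."""
    digits = re.sub(r"\D", "", raw)          # strip punctuation and spaces
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                   # drop the leading country code
    return digits if len(digits) == 10 else None

# Hypothetical citation data: the same office listed three ways,
# plus an old toll-free tracking number that should be flagged.
citations = ["(303) 555-0142", "303.555.0142", "+1 303-555-0142", "800-555-0199"]
canonical = normalize_us_phone("303-555-0142")
mismatches = [c for c in citations if normalize_us_phone(c) != canonical]
print(mismatches)  # only the stray tracking number remains
```

Anything left in `mismatches` is a number worth chasing down during the citation portion of the audit.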


Local SEO audit phase 1: Google My Business page

The new Google My Business Dashboard has a lot of useful information. Although I reference the Google Guidelines below, be sure to check them often. Google does change these sometimes and you won't really get any official notice. This happened rather recently when they started allowing descriptive words in the business name. Keep in mind that if any changes were recently made to your Google My Business page they may not show in the live version. It may take up to three days for these to show in the search results.

Any information collected below should be put in the "Current Info" tab on the spreadsheet under the Google My Business Information. This will also help us identify discrepancies right away when we look at the spreadsheet.

1. Locate the proper Google My Business page we should be working with

We can't really get started with an audit unless we know the proper page we're working with. Usually if a client hires you, they already have this information.

How to do this: If your client already has a Google My Business login, log in to their dashboard using the proper credentials. The back end of the dashboard should show the businesses associated with this account. Copy this URL and confirm with the business owner that this is the page they intend to use. If it's not their primary one, we will correct this a bit later below.

Goal: We want to find and record the proper Google My Business URL in our Local SEO Audit Spreadsheet.


2. Find and destroy duplicate pages

Duplicate Google My Business listings can be one of the greatest threats to any local SEO campaign.

How to: There are several ways to find possible duplicate pages, but I have found the easiest way is to use Google MapMaker. To do this, log in to your Google account and visit http://www.google.com/mapmaker or http://plus.google.com/local. From this page you can search for the business phone number, such as 555-555-5555, or the business name. If you see multiple listings you didn't know about, a major priority is to record those URLs and delete the duplicates.

I personally see a lot of issues when dealing with attorneys where each attorney has their own profile or in the case where an office has moved. There should only be one listing and it should be 100% correct.

You can also read my previous MOZ article.

Goal: Make sure there are no duplicate listings. Kill any duplicates.
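If you export the candidate listings you find into a list, near-duplicate business names can also be flagged programmatically rather than eyeballed. Here's a rough sketch using `difflib` from the Python standard library; the listing names and the 0.8 threshold are illustrative assumptions you'd tune against your own data:

```python
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a, b):
    """Case-insensitive similarity ratio between two listing names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical export of listings found when searching one phone number.
listings = [
    "Smith & Jones Law Offices",
    "Smith and Jones Law Offices LLC",
    "Denver Family Dentistry",
]

# Flag pairs whose names are suspiciously similar; each flagged pair
# is a candidate duplicate to review (and likely kill) by hand.
suspects = [(a, b) for a, b in combinations(listings, 2) if similarity(a, b) > 0.8]
print(suspects)
```

This only surfaces candidates; deciding which listing is canonical and deleting the rest is still a manual call.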


3. Ensure that the local listing is not penalized (IMPORTANT!)

Figuring out Google penalties in the local landscape is not usually a walk in the park. In fact, there are a lot of variables to consider, and this is a bigger deal post-Pigeon, as more organic signals are involved. We will look at other types of penalties later in this guide. Unlike with organic penalties, Google does not notify businesses of local penalties unless the account is suspended, with a big red warning on the back end of your My Business page.

According to Phil Rozek from Local Visibility System, "My first must-look-at item is: is the client's site or Google Places page being penalized, or at risk of getting penalized?"

How to do this: If your keyword is "Los Angeles personal injury attorney" then you could search for this keyword on Google Maps and Google Search results. If your business listing appears on the maps side in position C for example but then does not appear at all in local search results performing a normal Google Search, then it's likely there is a penalty in place. Sometimes you see listings that are not suppressed on the maps side but are suppressed on the places side. This is an easy way to take a look.

Goal: Do your best to determine that the listing is not penalized. If it is, consult a penalty expert for further guidance.


4. Is the Google My Business page associated with an email address on the customer's domain?

In my experience, it's best practice to have the login information for the business under an email address associated with the domain name. Additionally, this ensures that the client has primary control of their listing. As an example, if you ran Moz.com and had local listings, your Google My Business login should be something@moz.com instead of something@gmail.com. This helps establish that you are indeed the business owner.

How to: If someone else owns your Google My Business page you can transfer it to yourself. Read Google's Transfer Ownership guide.

Goal: The Google My Business login should be on an email address on the customer's domain.


5. Is the page verified?

Ensuring your Google My Business page is verified is essential if you want to take full advantage of your business listing. When you log in to your dashboard, you will be able to see right away whether it's verified or not. If it is not yet verified, you will want to verify it using an available method, usually by phone or by postcard. Typically the postcard option will take about a week to show up in the mail.

If you have not yet claimed your page you should be able to see this. Once you find the page you can click on the About tab and scroll to the bottom of the page where you will find the heading "Is This Your Business". From there you can click "Manage this page" and go through the process.

How to: You can verify your page from the back end of the Google My Business Dashboard.

Goal: The page must be claimed & verified.


6. Is the correct business name used?

It's crucial that the business name, address, and phone number are as consistent as possible across the web. Google now also permits a single descriptor in the business name, but beyond that, make sure you are only using your actual business name.

According to Google's guidelines, "You should represent your business exactly as it appears in the offline world." In addition, they go on to say "you may include a single descriptor that helps customers locate your business or understand what your business offers". Don't add keywords just to spam the search results; you will get caught and penalized. Here is the link to their local business name guidelines where you can read more.

How to: You can change this from the back end of the Google My Business dashboard once it's verified. Changing pertinent information may require a re-verification.

Goal: Ensure the proper business name is used and not inappropriately keyword stuffed. Add this information to the company information tab on your spreadsheet.


7. Is the correct address used?

This should also be consistent with the US Post Office and should be complete and accurate. According to Google's guidelines on addresses they should only contain information that is part of your official address. Don't add cross streets or other information in this section.

Google has a whole page dedicated to their address guidelines here.

In addition to this, you need to know that PO Boxes and UPS boxes are not allowed. Virtual offices are also a no-no. If the business doesn't actually have an office at the address, it could lead to a penalty.

Goal: Your address should be 100% complete and accurate. Suite numbers should be on address line two. Add this information to the company information tab on the spreadsheet. Make sure that no PO Boxes or UPS boxes are used.


8. Is the correct phone number used?

The local phone number for the corresponding office location should be used. Don't use an 800 number as the primary number; it's not best practice. This number should have a local area code.

How to: This can be changed in the Google My Business dashboard for your listing.

Goal: The businesses published main number (not a toll free or tracking number) should be used here. Add this information to the company information tab on the spreadsheet.


9. Proper category association (IMPORTANT!)

Using the correct categories for your business is essential. Custom categories are no longer allowed. You should be using all categories that are allowed for your industry. According to Google's guidelines you should "Add categories which describe what your business is, and not what it does."

In addition to this, Darren Shaw from Whitespark says "I think the most important thing in any local SEO audit is getting the categories right on the Google listing. I have seen a listing jump 7 spots just from a simple change of the primary category, and I have seen a listing completely disappear when an unrelated category was accidentally set. You want to make sure your primary category is the one that most closely represents your most important keyword, and you want to be careful to keep the other categories related to the main service(s)."

How to: This can be changed in the Google My Business dashboard for your listing.

Goal: Ensure your primary category is your main category. Use all categories that fit within these guidelines.


10. Email address

Under the Contact Information box in the Google My Business dashboard you will find the email address setting. Make sure there is a public email address here where customers can contact you. This email should be on your domain.

How to: This can be changed in the Google My Business dashboard for your listing.

Goal: Make sure this is filled out with a public email address on the client's domain name.


11. Proper URL

If you have a single-location business, it's appropriate to use your home page here. However, if you have multiple locations, best practice is to use the landing page for that particular location. Keep in mind that since the Pigeon update, it's also important to weigh organic signals. If you have poor site structure, you can shoot yourself in the foot by doing this when no authority is passed to your location landing page. This is the Website URL field in the Google My Business Dashboard.

Good URL Examples for landing pages:

www.MyDomain.com/denver/

Goal: The My Business listing should link to the page on your website that provides the best user experience.


12. Introduction description

According to Google's guidelines you should use this field to "Add a brief description of your business here. This is where you can introduce yourself to your customers and teach them about your business." Check that this description is also unique content by copying and pasting it into Google.

Goal: You should have a non-spammy introduction description. This should be unique content, and over 250 words if possible.
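The uniqueness check above amounts to searching Google for an exact phrase pulled from the description. Here's a small sketch of building such a quoted-phrase query URL with the Python standard library; the snippet text is a made-up example, and this only constructs the URL for you to open and eyeball:

```python
from urllib.parse import quote_plus

def exact_match_search_url(snippet):
    """Build a Google search URL for an exact-phrase query, a quick way
    to spot-check whether a description already appears elsewhere."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')

# Hypothetical sentence copied out of an introduction description.
snippet = "family-owned bakery serving downtown Denver since 1982"
print(exact_match_search_url(snippet))
```

If the resulting search returns pages other than your own, the description isn't unique and should be rewritten.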


13. Profile completeness

Your Google My Business profile should be 100% complete. If it's not, make sure to record the action items needed to get it there.

Goal: The profile marker at the top of the dashboard should show 100% complete.


14. Map & search photos

Photos play an important role in the carousel, and if a customer clicks through, they should see an accurate representation of your business in a very professional manner. If this is not the case then it should be fixed by providing better pictures.

Goal: Upload the best quality photos. If they suck, identify this as a problem.


15. Business hours

Are the business hours filled out correctly and completely for this location? They should be 100% filled out and accurate.

How to: These can be added from the Google My Business back end.

Goal: Business hours should be filled out and accurate.


16. Posts on G+

Posting to your page is a good way to show that you and your business are active. Posting regularly is an important step that's often overlooked.

How to: Post to your Google My Business page.

Goal: Make sure the business owner is posting consistently to their page. Preferably they should post at least weekly.


17. Trusted photographer used?

Google has always looked for ways to validate that a business exists at its posted location. People try to spam the listings and sometimes get away with it. I suspect that a virtual tour not only helps verify your listing on a previously unprecedented level, it also provides a great user experience.

How to: From the Google My Business page click the "Add Virtual Tour" link. This will take you to the Google Maps business view screen. Learn more about this program here.

Goal: You should budget for and schedule a Google Trusted Photographer shoot.


Local SEO audit phase 2: website and landing page optimization

Having a properly optimized site is more important than ever since the Google Pigeon update. Organic signals are now more tied into the local algorithm and having a properly optimized website will help you outrank your competition.

In your Google My Business dashboard you have the ability to link to a website as we looked at above.

Depending on your site structure and other factors, it may make sense to link to either the home page of your website or a landing page that is designated to that location.

Typically, a single-location business might have all of the pertinent information discussed below on its home page. On the other hand, if you have multiple locations, it generally makes sense to create a page for each location while reviewing the same factors and considerations. The steps below apply to whichever page is associated with your Google My Business listing; in this section we will be auditing this information on your own website.

1. Correct crawlable NAP on landing page

Having the proper NAP is just as important on your site as it is off it. You never want to send mixed signals to Google; keep your NAP accurate and consistent and you'll be set up for long-term success.

Common problems to look for

Using a tracking number. This is a big no-no unless it's done with advanced knowledge, extreme preparation, and a working understanding of the consequences.

If the phone number is in an image it should have the proper corresponding ALT text.

Goal: The landing page should have crawlable NAP that matches your Google My Business profile. If any part of the NAP is located within an image, keep in mind that Googlebot will be unable to read it.
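One way to spot-check this is a small script that looks for the business name and a formatting-tolerant phone pattern in the raw HTML. Text rendered only inside an image will fail this check, just as it does for Googlebot. This is a rough sketch with hypothetical business details, not a substitute for a manual review:

```python
import re

def nap_present(html, business_name, phone):
    """Check that the business name and phone number appear in the
    page's HTML source. The phone match ignores formatting by allowing
    any mix of spaces, dots, dashes, and parentheses between digits."""
    digits = re.sub(r"\D", "", phone)
    # Build a pattern like 5[sep]*5[sep]*5... so "(555) 123-4567"
    # matches the stored digits "5551234567".
    pattern = r"[\s\-\.\(\)]*".join(re.escape(d) for d in digits)
    name_found = business_name.lower() in html.lower()
    phone_found = re.search(pattern, html) is not None
    return name_found and phone_found
```

You would feed this the downloaded landing page source plus the exact NAP from the Google My Business profile.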


2. Site structure

According to Phil Rozek from Local Visibility System this is an important item and I agree. He mentions "Is there a page for every specific service (and location and practitioner, if applicable)? Does the homepage form the nucleus of the site, with a ton of useful detail on the page, and plenty of links to relevant subpages? Is the blog on the same domain? 9 times out of 10 people don't get the basics right."

If you are using a landing page for your geographic area, you can optimize your URL structure to accommodate best practices. A good location landing page URL is descriptive of your actual physical location. While there are plenty of ways to optimize your landing page, I prefer the non-spammy result, which would be something along the lines of:

http://www.YourSite.com/locations/Denver

This particular optimization item also requires a strong understanding of the site-wide URL structure. Don't just change this URL or take this decision lightly. Instead, be sure you have the full picture and a keyword map defined before jumping to a conclusion.

Goal: You should have a solid site structure that meets your long term optimization goals which includes the city and state if possible.


3. Business hours

Consumers who find this page organically or through Google My Business may be looking for your hours. If your business hours are located within an image keep in mind that Googlebot will be unable to read it.

Goal: Ensure the customer's business days and hours are listed in a crawlable format. These should, of course, match the business hours on your Google My Business page.


4. Landing page content

Having great content can be difficult at times, especially if you have multiple locations. One tip for multiple locations, which Mary Bowling has mentioned, is to ask local operators to write that content. If the person running each location writes the content, you'll get a different flavor each time. In my experience you should aim for at least 400 words, although I have seen pages with much less still be successful.

When it comes to content, its importance cannot be overstated. One great resource on landing page content is Miriam Ellis's "Local Landing Pages: A Guide To Great Implementation In Every Situation."

How to: I have a couple of methods to check whether the content is unique. The first is a quick check: copy a couple of sentences or a paragraph at a time and paste them into a Google search. If other websites come up (and not yours), there is a major issue that needs to be fixed. You can also use paid services such as Copyscape and Plag Spotter to run checks.

Goal: Have at least 400 words of unique (not copied) content on this page that is geared towards creating a great user experience. More is better and it should include the city and state.
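As a rough first pass on the word-count and location checks above, you can strip the tags and count words with the standard library. This is a sketch only (it doesn't distinguish boilerplate from body copy, and script/style text would need filtering on real pages); the city and state values are hypothetical:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

def content_report(html, city, state, minimum_words=400):
    """Report word count and whether the page mentions city and state."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)
    words = re.findall(r"\w+", text)
    return {
        "word_count": len(words),
        "meets_minimum": len(words) >= minimum_words,
        "mentions_location": city.lower() in text.lower()
                             and state.lower() in text.lower(),
    }
```

Anything flagged here still deserves a human read; unique, useful content matters more than hitting a number.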


5. Check and ensure your landing page is indexed

Assuming this page has been live for a while, do a quick Google search for the landing page URL. If it isn't showing up, you are likely to find major organic issues such as site architecture problems or penalties.

How to: Open a Google Chrome window and switch to Incognito mode. Since Incognito keeps you logged out of Google, the results will be closer to what your potential clients actually see. Copy the landing page URL for your business and paste it into Google. If your site shows up in the results, the page is indexed.

Goal: Your landing page should be indexable and indexed.
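Beyond the manual search, the common technical blockers (a non-200 status, an X-Robots-Tag header, or a robots meta tag saying "noindex") can be checked in a few lines. This is an illustrative sketch and deliberately ignores robots.txt and canonical tags, which also need reviewing:

```python
import re

def is_indexable(status_code, headers, html):
    """Rough indexability check on an already-fetched page:
    must return 200, and neither the X-Robots-Tag header nor a
    robots meta tag may contain 'noindex'."""
    if status_code != 200:
        return False
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return False
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta and "noindex" in meta.group(1).lower():
        return False
    return True
```

Passing this check means the page is indexable, not necessarily indexed; the Incognito search above confirms the latter.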


6. Landing page meta title tag

The landing page title is an important part of on-page optimization. Crafting the perfect landing page title can be a difficult task. Keep in mind that you want to optimize this landing page organically around the business name, keyword, and location, including city and state, for local SEO. Further reading from Matt Cutts about making a web page for each store location can be found here.

How to: You can check this using the MozBar under the page analysis icon.

Goal: The landing page meta title should contain the city and state.
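If you're auditing many locations, this check scripts easily: pull the <title> out of the source and look for the city and state. A minimal sketch, assuming you've already downloaded the page HTML:

```python
import re

def title_has_location(html, city, state):
    """Extract the <title> tag and verify it mentions city and state."""
    match = re.search(r"<title[^>]*>(.*?)</title>",
                      html, re.IGNORECASE | re.DOTALL)
    if not match:
        return False  # no title tag at all is its own problem
    title = match.group(1)
    return city.lower() in title.lower() and state.lower() in title.lower()
```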


7. Meta description

The meta description is the snippet that may show in the search results underneath the title. Google can change this text based on searcher intent, but it's still best practice to have one in place to increase clickthrough rates and possibly control what displays.

How to: If you use the MozBar it's easy to find the meta description of the page you're analyzing by simply clicking the page analysis button. Changing the meta description will be done through your CMS.

Goal: The meta description should include the business name, city, state, and local phone number.


8. Heading tags

Having a single H1 tag on your landing page is an essential part of organic SEO. If you can include the city, state, and keyword in a non-spammy way, go for it.

How to: You can also easily check for this using the MozBar page analysis feature.

Goal: You should have only one H1 tag, and it should include the city and state.
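Counting H1s (and grabbing their text so you can eyeball the city/state wording) is straightforward with the standard library parser. A small sketch:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags and capture their text content."""
    def __init__(self):
        super().__init__()
        self.count = 0
        self.in_h1 = False
        self.texts = []
    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1
            self.in_h1 = True
    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False
    def handle_data(self, data):
        if self.in_h1:
            self.texts.append(data.strip())

def audit_h1(html):
    """Return (number of H1 tags, combined H1 text)."""
    parser = H1Counter()
    parser.feed(html)
    return parser.count, " ".join(parser.texts)
```

A count other than 1 is the red flag to record in your audit spreadsheet.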


9. Driving directions & embedded map on landing page

Driving directions create a great user experience for someone looking for your business. Providing them in text format, alongside an embedded map with landmark pictures, improves the user experience even further.

How to: Go to the classic Google Maps and grab the embed code after searching for your business here.

Goal: Your landing page should have driving directions including an embedded map.


10. Payment information

Having payment information on your website is a great customer resource for your clients. Not only is it a content freebie but having this information may increase conversions. I like to add this into an audit so that people have an expectation and one less question when they call you.

Goal: The forms of payment you accept should be visible on your location landing page.


11. Customer reviews on page in Schema / hReview

Having text reviews from clients on your site is a great way to increase consumer confidence. If you plan on putting them on your landing page, it's a good idea to mark them up with hReview. This structured data tells Google what type of content they are; if Google knows they are reviews, it will classify them as such. In addition to increasing consumer confidence, they can also increase clickthrough rates by triggering review stars in the search results. More clicks = more business for your clients.

How to: I like it when things are easy, which is why I use the MicroDataGenerator page found here. Simply fill out the fields, click submit, and it will generate HTML you can copy and paste into your website.

Goal: You should have at least one written customer testimonial that is marked up with Schema or hReview.
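If you'd rather template the markup yourself than use a generator, the shape of a schema.org Review block in microdata looks roughly like the snippet this helper produces. This is a minimal hand-rolled sketch, not the MicroDataGenerator's output; the field set is deliberately small, so check Google's current structured data guidelines before relying on it for rich snippets:

```python
from html import escape

def review_microdata(reviewer, rating, body, item_name):
    """Build a minimal schema.org Review block in microdata format.
    All inputs are escaped so user-submitted review text is safe to embed."""
    return (
        '<div itemscope itemtype="http://schema.org/Review">'
        '<span itemprop="itemReviewed" itemscope '
        'itemtype="http://schema.org/LocalBusiness">'
        f'<span itemprop="name">{escape(item_name)}</span></span>'
        f'<span itemprop="author">{escape(reviewer)}</span>'
        '<span itemprop="reviewRating" itemscope '
        'itemtype="http://schema.org/Rating">'
        f'<span itemprop="ratingValue">{escape(str(rating))}</span></span>'
        f'<p itemprop="reviewBody">{escape(body)}</p>'
        '</div>'
    )
```

Whatever you generate, run it through Google's structured data testing tool before shipping it.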


12. Alt text on landing page

If you have images on your landing page, best practice is to give them ALT text. Assuming the image is of the business location, the ALT text might be something like "Business Name in City, State".

How to: You can either view the source or hover over images to view the ALT text.

Goal: Your ALT text should include the city and state if appropriate.
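Hovering over images works for a handful of pages; for a bigger site, a quick pass that lists every image missing ALT text saves time. A sketch using only the standard library:

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Collect the src of every <img> that has missing or empty alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

def images_missing_alt(html):
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing
```

Each src it returns is an action item for the spreadsheet.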


13. Page authority of landing page

Since Google's Pigeon update I have seen many instances where having a stronger landing page has helped my clients be more successful in local SERPs.

How to: You can pull the page authority of your landing page by opening your page in the MozBar. The metric titled "PA" will show you your page authority.

Goal: Record the page authority of the landing page. We can use this as a comparative metric later.


14. NAP in hCard / Schema.org

We already made sure the NAP is crawlable on the website. Marking it up in hCard or Schema.org as well helps Google identify the type of data you are posting: in this case, a local business.

How to: Getting your NAP into Schema can be done by hand coding it or by using a generator where you simply input the information. I use the Microdata Generator found here.

Learn more about the hCard microformat.

Goal: Have the crawlable NAP marked up in hCard or Schema.org.


15. Load time of landing page

Site speed is an important metric for a successful website. The crawl budget assigned to your website is correlated with site speed: if your site takes too long to load, Google won't spend as much time crawling it. Users feel the same way; if a page takes too long to load, they bounce. This factor is often overlooked and should never be ignored. Also make sure to check the load time of the home page and other important pages.

How to: There are many ways to measure your site speed. I like to use URI Valet and try to get under 3.5 seconds at a 1.5 Mbps connection. In addition, Google Analytics will report on your speed from its point of view. Review your whole site, and your landing page in particular, for any spikes and try to identify issues there. If you want to learn how to fix problems and get recommendations, check out Google's PageSpeed Insights.

Here is another tool you should check out as well.

Goal: As a standard metric, I like pages to load in under 3.5 seconds at 1.5 Mbps.
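For a quick baseline measurement before reaching for the full tools, you can time a single fetch from your own machine. This is a rough sketch only: one sample, from your network rather than a throttled 1.5 Mbps connection, so treat it as a sanity check, not a benchmark:

```python
import time
import urllib.request

def load_time(url, timeout=10):
    """Time one full download of the URL's response body, in seconds.
    Real audits should average several runs and compare against
    Google Analytics site speed reports."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
    return time.perf_counter() - start
```

Run it a few times per page and record the numbers alongside your URI Valet results.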


16. KML file on domain name

Although Google does not officially support KML files, I believe including this information takes you a step closer to perfection.

How to: To see if a KML file has been submitted, check Sitemaps in Google Webmaster Tools. If you're interested in adding one, check out the Geo Map Site Generator to create a geo sitemap.

Goal: Have a KML file on your domain name with all of your locations. You can reference it through an XML site map.
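The KML itself is small enough to generate by hand. A minimal sketch that emits one Placemark per location (the Denver coordinates are illustrative; note that KML puts longitude before latitude):

```python
import xml.etree.ElementTree as ET

def build_kml(locations):
    """Build a minimal KML document from (name, longitude, latitude)
    tuples, one Placemark per business location."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for name, lon, lat in locations:
        placemark = ET.SubElement(doc, "Placemark")
        ET.SubElement(placemark, "name").text = name
        point = ET.SubElement(placemark, "Point")
        # KML coordinate order is longitude,latitude
        ET.SubElement(point, "coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")
```

Save the output on your domain and reference it from your XML sitemap as described above.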


17. Domain authority of website

Although this is determined by off-site factors, I like to look at it early to get a better understanding of where you stand against your competition.

How to: You can check this using the MozBar; the "DA" metric in the toolbar is Domain Authority.

Goal: Have the highest domain authority over your competitors (naturally obtained). Record this number in the spreadsheet so you can compare against your competition.


18. Footer address

Having an address in your footer can be a good thing, in my opinion, if you have one location. When you have multiple locations, and therefore multiple footer addresses, you can create data confusion and, more importantly, content dilution.

How to: Check this by simply checking the footer of the website.

Goal: If you have one location it's easy and not spammy to include your single location in the footer of every page.


19. Is the content keyword stuffed?

What is keyword stuffing? Keyword stuffing is using so many keywords on your page that they appear unnatural. How many is too many? Too many is too many. If you're trying to rank for the keyword "Blue Nike Shoes," you might mention that exact phrase only once or twice in your content.

Bad Example: "Welcome to my shoe store where we sell Blue Nike Shoes. Our Blue Nike Shoes are the best Blue Nike Shoes you can buy on the internet."

If you were writing naturally about Blue Nike Shoes, you might introduce them by the full name once, but after that you would likely just refer to them as shoes or footwear.

Learn more about keyword stuffing and irrelevant keywords in Google's guidelines.

Goal: No content should be keyword stuffed.
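There is no official threshold, but you can at least quantify how much of a page's copy a single phrase consumes and flag outliers for a human look. A sketch that measures the bad example above:

```python
import re

def keyword_share(text, phrase):
    """Percent of the page's words that belong to occurrences of the
    phrase (case-insensitive). A high value is a keyword-stuffing smell,
    though there is no official safe number."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 100.0 * hits * len(phrase.split()) / len(words)
```

Run it against the bad example above and a third of the copy turns out to be the money phrase; natural writing lands far lower.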


20. Landing page easily readable by search engines

A website that looks cool because it's built entirely in Flash is one thing, but it's a big no-no for the search engines. The content on your page should be crawlable.

How to: One easy trick is to use Select All (Edit > Select All). This will highlight everything on the page, including all crawlable text, in blue.

Goal: Make sure your important text is not in images. Ensure the page is not all in Flash and that the text is crawlable.


21. Is your site design mobile friendly?

Google representatives have mentioned on more than one occasion that a site that is not mobile friendly may have reduced visibility in the search results, so you need a mobile-friendly design. I prefer responsive designs because the content resizes properly depending on the device you're on.

Learn more about responsive website designs.

How to: To check whether the website is mobile friendly, pull up the website and the landing page on your mobile device or tablet. You can also check out tools like BrowserStack, which will show you how your website looks in different browsers and screen sizes.

Goal: Ensure your entire website is mobile friendly, preferably through the use of a responsive design.


22. Is your site content mobile friendly?

If you have a mobile site on a separate domain like m.mysite.com, you need to be sure its content is not being indexed as duplicate content. If you are using a responsive design, this is not an issue. This should be reviewed closely: although Google offers support for tools such as canonical tags to handle duplicate content versions, I have found that Google is imperfect, and when you make it decide, you open yourself up to more errors.

Goal: Run a duplicate content check to make sure the full site and mobile site are not both indexed.
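Beyond the paid tools, a crude internal check is to compare the extracted text of the desktop and mobile versions of a page. The threshold is a judgment call; this sketch just gives you a number to react to:

```python
from difflib import SequenceMatcher

def content_similarity(text_a, text_b):
    """Similarity ratio (0.0-1.0) between two pages' extracted text.
    Values near 1.0 suggest duplicate content between, for example,
    the desktop site and a separate m-dot site."""
    return SequenceMatcher(None, text_a, text_b).ratio()
```

If both versions score near 1.0 and both are indexed, that's the duplicate-content situation to fix.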


23. WHOIS information review

This is more of a habit on my side. If the owner of the domain name has opted not to have privacy protection, the registration is required to list the contact name, address, phone number, and email address of the domain owner. I suggest keeping this information public and having it match your main location. Whether Google uses it or not, I can't think of a better way of showing that you own the business.

How to: There are plenty of services out there that allow you to check the WHOIS information. GoDaddy even has a free WHOIS check here.

Goal: Check the WHOIS Information and ensure the NAP matches the main location.


24. Google Analytics

Having Google Analytics installed is essential. If your client does not have this installed it will be rather difficult to track problems, traffic changes, conversions, and other important visitor information.

How to: Check whether this is installed by using your browser's view-source option. Search for "analytics" and see if the tracking code snippet is present. You can learn more about using Google Analytics here.

Goal: Make sure Google Analytics is installed.
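The view-source search can be scripted across many pages. This sketch looks for the classic ga.js/analytics.js script reference or a UA- property ID; a miss doesn't prove GA is absent (it may load through a tag manager), but a hit is strong evidence it's installed:

```python
import re

def has_google_analytics(html):
    """Heuristic check for a Google Analytics installation in page source."""
    patterns = [
        r"google-analytics\.com/(ga|analytics)\.js",  # script reference
        r"\bUA-\d{4,10}-\d{1,4}\b",                    # UA property ID
    ]
    return any(re.search(p, html) for p in patterns)
```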


25. Google & Bing Webmaster Tools

Google Webmaster Tools provides a lot of useful information. This is where Google will tell you whether your site is penalized, whether your pages are indexed, and about any crawl errors.

How to: Log in to Google and Bing Webmaster Tools and see if your accounts are set up properly.

Goal: Ensure Google Webmaster and Bing Webmaster Tools are configured properly.


Local SEO audit phase 3: citations audit

When it comes to citations, more does not mean better. Let me say that another way: better citations will always be more important than more citations from lower-quality directories. I cannot stress this enough. Earlier in my local SEO career I made this mistake with clients. If you take the route of wanting the most citations, you must do your own extensive due diligence on the long-term impacts this could have for your clients. For the most part, if you've never heard of a directory and it's easy to get into, you should conduct a link analysis on the domain. Many citations also come with a link, some of which are followed links. You can easily undo a lot of good work by adding hundreds or thousands of low-quality directories with links.

Again I cannot stress the importance of this enough. Please be careful and ethical in your approach when getting citations.

What is a citation?

In case you don't know what a citation is: a citation is simply a listing of your business name, address, and phone number (or NAP, as we call it in the industry) on the web. For example, if I have a Yelp profile, it will include my business name, address, and phone number along with other pertinent business information. This is a structured citation, as it's a sanctioned business directory that provides this information. There are also what we call unstructured citations: mentions of your NAP on websites that are not necessarily business directories. For example, if my business were mentioned in my local newspaper, they might include my name, address, and phone number in their online article.

Where did these come from?

Citations may have been created by you, by your company, or out of thin air from a phone book listing or some other source. There are companies that sell this business listing information to other directories and feed them data; these are called data aggregators. It's essential that your listings are correct with the data aggregators, as this ensures the listings downstream are as correct as possible from the top down.

Are they that important?

Citations have always played a heavy part in the local SEO algorithm, although my personal opinion is that their weight has been reduced post-Pigeon. That being said, even if your opinion differs on this subject, you shouldn't disagree that your business's visibility online should be as close to perfect as possible. If you want to provide the best customer and search engine experience, it only makes sense to have a clean citation profile. Another reason for this, which we will talk about later, is reviews. Duplicate listings don't just hurt your local SEO rankings; your customers might not find the right listing and leave, or see incorrect information or outdated reviews.

The overall goal of the citation audit

To ensure that you are listed, that there are no duplicates, and that the information is 100% correct on the top 50 citation sources and all of the data aggregators.

Thanks to my friends over at Whitespark, we have been allowed to publish their list of the top 50 citations and I have included them in the spreadsheet. Keep in mind that you also want to search with an iOS device and ensure your business is listed in Apple Maps.


1. Check data aggregators

According to Moz.com, a data aggregator is "A company that maintains and supplies the underlying business database for local search directories." In other words, they are the data holders that submit your business information to other directories. The main data aggregators in the United States are Infogroup, Localeze, Acxiom, and Factual.

How to: One way to get a snapshot of this and fix it is to use the Moz Local platform. For $84 a year you get listed with the top data aggregators; doing it on your own would cost a lot more. Read about it and sign up here. Signing up for Moz Local can alleviate some of the data-aggregator stress and save you money versus doing it yourself.

Goal: Make sure your business is listed correctly without duplicates on the main data aggregators including Infogroup, Localeze, Acxiom, and Factual in the US.


2. Check the top 50 citations

This is where the hard work gets real. As I mentioned earlier, more is not better. Using the list in the provided spreadsheet, ensure that you are listed with 100% consistency, without duplicates, in the top 50 citation sources. Any problems identified here are a priority to fix. In addition to being listed correctly without duplicates, make sure the profiles are 100% filled out. A complete profile always provides a better user experience.

Click here to view the list of Top 50 Citation Sources provided by Whitespark.

Goal: Ensure the top 50 citations for your industry are live, correct, and don't have duplicates.
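When comparing the same listing across dozens of directories, most "mismatches" are cosmetic: capitalization, punctuation, phone formatting, St vs. Street. Normalizing before comparing keeps the audit focused on real inconsistencies. A sketch with a deliberately tiny abbreviation map (hypothetical business data):

```python
import re

ABBREVIATIONS = {"street": "st", "avenue": "ave", "suite": "ste"}

def normalize_nap(name, address, phone):
    """Normalize a NAP record so cosmetic differences don't register
    as mismatches. The abbreviation map is an illustrative sample;
    a real audit needs the full USPS abbreviation list."""
    def clean(text):
        text = re.sub(r"[^\w\s]", "", text.lower())
        return " ".join(ABBREVIATIONS.get(w, w) for w in text.split())
    return (clean(name), clean(address), re.sub(r"\D", "", phone))

def nap_matches(record_a, record_b):
    """True if two (name, address, phone) records agree after cleanup."""
    return normalize_nap(*record_a) == normalize_nap(*record_b)
```

Records that still disagree after normalization are the ones worth a correction request to the directory.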


3. Identify new high quality citation sources

If your top 50 citations are in order, there may still be a few more worth pursuing, depending on your industry. A safe long-term goal for everyone is unstructured citations, or mentions in local and major newspapers and publications. Additionally, see what niche sites are available for your type of business.

Some examples include:

Attorneys: Lawyers.com, Martindale.com, Avvo.com

Restaurants: Yelp.com, TripAdvisor.com, OpenTable.com

If you want to learn more you can read my other article to learn how you can find citations like an agency.

If you're looking for a good list of niche citations, check out BrightLocal's newly published Best Niche Citation Sites for 41 Business Categories.

Goal: Get high quality unstructured citations when possible.


4. Verify you're on Apple Maps

With the user base Apple has, it's important to make sure you're listed on Apple Maps. Previously this was a pain in the rear if you weren't already listed; however, in October 2014 Apple announced Apple Maps Connect.

How to: Visit Apple Maps Connect and log in using an Apple ID. This is a brand new product but you should be able to manage your listings here.

Goal: Make sure your NAP is correct on Apple Maps.


Local SEO audit phase 4: organic penalty analysis and link audit

There are a lot of SEO companies out there. If your client has ever had a different website it's even more important to conduct a link analysis on their website. One of your responsibilities as someone performing an audit is to ensure that you can keep your client out of a penalty short term and long term. Even if you didn't do the work yourself, you need to identify any potential problems and prepare them so they can make adequate business decisions. Google has been and can be pretty brutal with their penalties, and recovery is not a walk in the park.

Since the Pigeon Update, Google has put more weight on high quality links and overall authority into the local algorithm. I hesitate to say this knowing that some will just want to get more links. Don't do this. Don't even think about this. Links should be earned through content you produce or news mentions you get. Additionally, keep in mind that one strong link is better than 100,000 crappy links...by far.

This is also super exciting. If you notice that a competitor is beating your rankings with spammy links, you can rest assured that they'll get caught. In addition, you can develop a strategy that kicks their spam to the curb without the risk of future penalty. It's a win-win, and a step towards long-term, lasting results.

The problem with a link audit

Experience is key when conducting a link audit. For those of us that have reviewed hundreds of thousands of links by hand it becomes a bit easier. Tools also make this job easier, but tools are also imperfect. I could also write a 10,000+ word article on link auditing based on my personal imperfect experience.


1. Ensure there is not a manual web spam action

Having a manual web spam action will ruin your day (or year). When you receive a penalty, Google can decide to take all or some of your website out of its index and hide pages from users. This will become your top priority to fix if it exists.

How to: If your site is verified in Google Webmaster Tools (which it should be) you can simply click on the manual actions tab. If nothing shows up you're home free in this department.

Goal: Check Google Webmaster Tools and make sure there are no penalties under the manual actions tab. If there are, fixing them becomes your top priority.


2. Check for algorithmic penalties

If you see any sharp drop in organic traffic in Google Analytics then there is a problem. Identifying algorithmic penalties can be a bit tricky, but, with proper work, you can identify and remove these. Check out Moz's complete guide to algorithm changes.

How to: I prefer to take the easy route when it comes to addressing algorithmic penalties. If you have access to Google Analytics and have had enough website traffic to spot trends you can use a tool that will align your traffic numbers with the dates of known algorithmic updates. The Panguin tool makes this easy to do.

Goal: Try your best to determine if there are any algorithmic penalties and what you can do to fix them.


3. How many links do the site and page have?

Knowing the number of links can help determine whether there will be a potential problem in the future. If you have far more links than your competitors, it's worth checking them out in detail. If you find that they are all low quality or spammy, you have a potential problem on your hands and need to get it taken care of as soon as possible. Don't wait until it's too late.

How to: The easiest way to check this is just using the MozBar.

Goal: Record the number of links to your landing page and root domain and compare it to your competitors.


4. Anchor text of links

Anchor text is the clickable, highlighted text of a hyperlink. For example, on an online store you might click "Checkout," which leads you to the checkout page; if that link is clickable text (not a button), that text is the anchor text. Your anchor text distribution is very important to organic SEO (and now local SEO), as it correlates heavily with the penalties we are trying to avoid.

In some ways, this matters even more for local businesses. If someone is writing about you or reviewing your local business, they would naturally make the anchor text your business name, or your business name with the location. For example, if you ran a Quiznos in Denver, you might be linked via the anchor text "This Denver Quiznos Location." That is natural, because the writer decided to link to you in a way that describes your business name and location in their own words. However, if Quiznos wanted to rank for "Sandwich Shop" and every link on the internet pointing back to them said "Sandwich Shop," you can see how unnatural and unfeasible that would be. Unless someone was creating those links intentionally, that phrase would not exist as the only anchor; instead, you would see many variants like "sandwich shop in Denver," "Denver sub shop," "my favorite subs," or the brand name.

With anchor text, having too much of anything can be a bad thing. Now, here is the official answer to anchor text distribution and how much is too much: It depends on your industry.

You can read more about anchor text in the Anchor Text Distribution Study: Powered By Search Anchor Text Study

Goal: Your link profile should appear natural. If you're a "Personal Injury Attorney" and 10% of your anchor text is a money phrase, that is bad. Brand anchor text should always occur more often than commercial keywords.
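Once you've exported anchor texts from your link tool of choice, computing the distribution is a one-liner with Counter. Deciding which anchors count as "money" phrases versus brand still takes human judgment; this sketch just produces the percentages to judge:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Percentage breakdown of a list of anchor texts (case-insensitive)."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: round(100.0 * n / total, 1) for text, n in counts.items()}
```

If a commercial keyword outranks the brand name in this breakdown, that's the unnatural profile described above.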


5. Link velocity

Another thing that gets people in trouble is link velocity: the rate at which links to your website are acquired. Some site owners who have never competed in the web landscape suddenly want to rank #1, notice their competitors have a bunch of links, and, thinking it's the right thing to do (it's not), add a bunch overnight. If you go viral or do something newsworthy, a spike will be natural and substantiated by strong links from high-quality sites. On the other hand, if you just add low-quality links to get them, you are at a much higher risk of penalty. Again, don't do it. Earn links and sleep well at night. If there were spikes in the past, review them and remove links as necessary.

Goal: Make sure that the link velocity seems natural. Review any spikes for problems.


6. Does a disavow exist?

Your client's site will have bad links. Maybe it's 1 or 2, or maybe it's 200,000. Either way, we need to figure this out. If bad links exist, a disavow file needs to be uploaded. This will be your saving grace if an algorithmic penalty exists and a Penguin update takes your disavow into consideration. The most recent Penguin update took place in October 2014, after more than a year without one.

Goal: Ensure the disavow file is updated monthly to include any bad links whether they were intentional or from negative SEO.


Local SEO audit phase 5: reviews

Reviews are one of the most overlooked problems with local businesses. It's not that they're not concerned with them but rather that they've tried to fix them in the past but ran into resistance. Make no mistake about it, reviews are hard to get.

By |November 5th, 2014|MOZ|4 Comments

Enabling HTTPS Without Sacrificing Your Web Performance


Posted by Zoompf

A fast website is a crucial component of a great user experience. So much so that a few years ago Google announced that they were using page load time as a factor in search engine rankings. More recently, Google also announced that they would be favoring websites that use Transport Layer Security (TLS) in its search rankings. TLS encrypts a website's traffic preventing other entities from monitoring its communications. However, adding this protection introduces more complexity to your website and how it communicates with your visitors, potentially slowing things down and negatively affecting the user experience. In this blog post, I will show you how to implement TLS on your website while keeping it fast and responsive.

Conventions and disclaimers

Before we start, a quick note on naming. Besides TLS, you may also have heard the term SSL. SSL was the original encrypted connection protocol, created by Netscape in the mid '90s. TLS is the industry-standard protocol that grew out of SSL and has continued to evolve and improve, while development of SSL has ceased. In the past, SSL and TLS were largely interchangeable terms. However, the final version of SSL, SSLv3, was recently found to be insecure. All versions of SSL now have known security problems, and no one should be using any version of SSL. To avoid confusion, we will not mention SSL again and will talk exclusively about TLS.

Additionally, while TLS does help protect your website's visitors from eavesdroppers, it does not magically make your site "secure" against flaws like cross-site scripting or SQL injection. If you store personal data or conduct commerce through your website, you should explore more rigorous web application security options.

Finally, any type of guide, tutorial, or how-to post on security is highly time sensitive. Attackers are constantly evolving and new attacks are always discovered. Advice about optimizing TLS performance from even 2 years ago, such as using RC4, would today leave your site unsecured. You should always maintain vigilance and make sure you trust your sources.

TLS Areas that need TLC

There are 2 areas of TLS that can harbor performance problems:

  1. Encrypting the data. Data sent back and forth between visiting web browsers and your web server must be encrypted and decrypted. If this is not configured properly, your page load times can become much slower than with unencrypted traffic.
  2. Establishing a secure connection. There are several steps that must occur before a browser establishes a secured connection to your website: identities must be confirmed, algorithms must be selected, and keys must be exchanged. This is known as the TLS Handshake, and it can have a significant impact on your site performance.

We need to give each of these areas some Tender Loving Care (TLC) to optimize for performance. Let's discuss each in detail.

Optimizing Data Encryption

When you use TLS, you are adding another 2 steps to the process of how a browser and web server communicate. The sender has to work to encrypt the data before transmitting it, and the receiver has to decrypt the data before it can process it. Since these operations are occurring on all of your web traffic all of the time, you want this exchange to be as efficient as possible.

There are a large number of ciphers that can be used to perform this encryption/decryption. Some, such as 3DES, were originally designed to be implemented in hardware and can perform slowly when implemented in software on your computer or phone's browser. Others, such as AES, are so popular that CPU makers like Intel have added dedicated instructions to their chips to make them run faster.

A decade ago, TLS data encryption added significant overhead. Today, Moore's law and dedicated support for certain ciphers in CPUs have essentially eliminated this overhead, provided you select the right cipher. The consensus among security engineers and the administrators who run large TLS websites is to use AES with 128-bit keys. We can see from the chart below that AES running on a CPU with built-in AES instructions (denoted by the label AES-NI) is by far the fastest cipher you can use.

Specifying which cipher and options to use can be quite challenging and intimidating. Luckily, Mozilla maintains an excellent page with example cipher configurations to ensure fast and secure connections. These example configurations work with all browsers, and they default to using the faster algorithms like AES. These are constantly updated when new security threats come out, and I highly suggest following their guidance.

As mentioned, to get the most out of AES, your web server will need to use a CPU that supports the dedicated AES instructions known as AES-NI. Most server-grade CPUs made in the last 5 years, such as Intel's Xeon line, support AES-NI. However, older virtualized servers and cloud servers often don't support these instructions. Amazon's M1 class of EC2 instances does not support AES-NI, whereas Amazon's current class of M3 instances do. If you are using a hosted service, check with your hosting provider about what TLS options they support and whether your hosting computer supports AES-NI.

In short, by configuring your web server to use AES ciphers and terminating your TLS connections on a machine whose CPU supports AES-NI instructions, you can effectively mitigate the performance penalty of data encryption.

Optimization Checklist

  • Enable AES as your preferred cipher, following Mozilla's guidelines.
  • Verify that your web server is running on a system with a CPU that supports AES-NI instructions.
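As a sketch of what that configuration can look like in nginx, with assumed certificate paths: the cipher list here is abbreviated for illustration, so take the real string from Mozilla's guidelines rather than copying this one.

```nginx
# Illustrative nginx sketch -- take the actual cipher string from
# Mozilla's server-side TLS guidelines rather than hard-coding it.
server {
    listen 443 ssl;
    ssl_certificate     /etc/ssl/example.com.chained.crt;  # cert + intermediates
    ssl_certificate_key /etc/ssl/example.com.key;

    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;   # no SSLv3
    ssl_prefer_server_ciphers on;          # server picks the cipher
    ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA256;  # abbreviated
}
```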

Optimizing the TLS Handshake

The TLS handshake is the process that the browser and server follow to decide how to communicate and create the secured connection. Some of the things that happen during the handshake are:

  • Confirming the identity of the server, and possibly the client
  • Telling each other what ciphers, signatures, and other options each party supports, and agreeing on which to use
  • Creating and exchanging keys to be used later during data encryption

The TLS handshake is shown in this rather technical-looking diagram:

[Diagram: the full TLS handshake message flow between browser and server]

Don't worry. While there are a lot of details in the diagram, the takeaway is that a full TLS handshake involves 2 round trips between the client and the server. Because of the difference between latency and bandwidth, a faster internet connection doesn't make these round trips any faster. The handshake will typically take between 250 milliseconds and half a second, but it can take longer.
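The arithmetic behind that range is simple enough to sketch: the handshake cost is round trips multiplied by latency, so extra bandwidth doesn't help. The 125 ms round-trip time below is an assumption for illustration.

```python
# The handshake's cost is round trips x latency, so a faster connection
# (more bandwidth) doesn't shrink it. With an assumed 125 ms round-trip
# time (RTT):

rtt = 0.125               # assumed round-trip time, in seconds
full_handshake = 2 * rtt  # a full TLS handshake needs 2 round trips

print(full_handshake)     # 0.25 seconds before any data can move
```

At that assumed latency the handshake alone lands at the low end of the 250 ms to half-second range quoted above; slower mobile links push it toward the high end.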

At first, a half-second might not sound like a lot of time. But the primary performance problem with the TLS handshake is not how long it takes; it's when the handshake happens. Since TLS handshakes are part of creating the secure connection, they have to happen before any data can be exchanged. Look at the waterfall diagram below (if you need help, check out how to read a webpage waterfall chart):

[Waterfall chart: TLS handshakes delaying the initial HTML response]

The TLS handshake, shown in purple for each step here, is adding 750 ms of delay to the time it takes to get the initial HTML page. In this example, getting the HTML page over TLS takes twice as long as getting the same page over an unencrypted connection! Worse, the browser can't do anything else until it gets this initial HTML page. It cannot download other resources, like CSS files or images, in parallel, because it hasn't yet received the HTML that tells it about those resources. This is true for every secured webpage you visit: the browser is blocked from getting that first HTML response, and the TLS handshake increases the time during which the browser can do nothing, slowing down your site.

Also, remember that TLS handshakes happen at the start of every new HTTP connection. Since browsers download resources in parallel, a visiting browser will create multiple TLS connections and have to wait for multiple handshakes to complete, even when visiting a single page!

Unfortunately, there is not a lot of extra or unnecessary data that we can optimize during the TLS handshake. The primary aspect we can optimize is the "confirming the identity of the server" step. To do this, the browser looks at something called the certificate chain.

The certificate chain

When you visit https://www.example.com, how do you know you are really talking to www.example.com? TLS certificates solve this problem. You receive a certificate telling your browser, "Yes, this is www.example.com. Trust me." But how do you know you can trust the certificate the server sent?

This is where the certificate chain comes in. Your certificate will be digitally signed by some other entity's certificate, which essentially says "example.com is cool, I vouch for it, here is my certificate." This is called an intermediate certificate. Browsers come with a built-in list of a thousand or so certificates that they trust. If the browser happens to trust this intermediate certificate, we are done. However, it's possible the browser trusts neither your website's certificate nor the intermediate certificate.

What happens then? Simple! The browser looks to see who signed the intermediate certificate, and who signed that one, and so on... Basically, the browser "walks" up this chain of certificates, seeing who vouches for whom, until it finds a certificate from someone on the built-in trusted list mentioned above.

The certificate chain looks something like this:

[Diagram: the certificate chain for app.zoompf.com]

Here we see the certificate for my website, app.zoompf.com. My certificate was signed by the "DigiCert Secure Server CA" certificate. The browser does not trust that certificate, since it's not in the built-in list. However, the "DigiCert Secure Server CA" certificate was in turn signed by the "DigiCert Global Root CA" certificate, which is in that list and is thus trusted. So in this case, my certificate chain length is 3.

You can optimize your site performance by making this certificate chain as short as possible, since validating each certificate in the chain takes extra time. Additional certificates also mean more data that has to be exchanged while establishing the secured connection. The browser might even need to make additional requests to download other intermediate certificates, or to check that each certificate in the chain is still valid and hasn't been revoked.

When shopping for a TLS certificate, ask the vendor:

  • What certificate will be used to sign your certificate, and how long will the certificate chain be?
  • Will they include their intermediate certificate bundled with your certificate, so the browser won't have to wait to download other certificates while walking up the certificate chain?
  • Do they support OCSP stapling, to reduce the time needed to check for revoked certificates?

I recommend purchasing your certificate from a large, well-known vendor. These tend to offer better support and features like OCSP stapling. They are also more likely to have their signing certificates already trusted by browsers, and thus a shorter certificate chain. You can learn more about how to test your certificate chain here.

Optimization checklist

  1. Minimize the length of your certificate chain.
  2. Verify that any intermediate certificates are bundled with your certificate.
  3. Get a certificate that supports OCSP stapling, if possible.

Avoiding full TLS handshakes

At its heart, the TLS handshake is about the client and the server verifying each other, agreeing on a common set of ciphers and security options, and then continuing the conversation using those options. It seems silly that a client and a server that have recently communicated need to go through this full process over and over again. Imagine this scenario: You are visiting a blog like this one over TLS. Multiple TLS connections with multiple handshakes were made to download all the content. A few minutes later, you click a link to read a different page on the site, which causes your browser to do multiple TLS handshakes all over again.

This is where TLS session resumption comes in. Basically, TLS session resumption allows a client to say, "Hey server, we communicated a little while ago and did so using the following TLS options... Is it OK to start talking again using those same options?" This is a huge performance improvement: a full TLS handshake requires 2 round trips to create the secure connection, while session resumption needs only 1.

The great thing about session resumption is that it is basically a free shortcut. When the client asks the server, "Can we use these previously agreed-upon settings?", it does so as part of the first round trip of a full TLS handshake. If the server agrees, great: the shortcut is taken and no further handshaking is necessary. If, for whatever reason, the server doesn't agree, the TLS handshake continues as normal and completes in 2 round trips. There's no reason not to use session resumption.

There are 2 different mechanisms for implementing TLS resumption: session identifiers and session tickets. They both accomplish the same thing; the difference is primarily which side has to keep track of the previously agreed-upon options. All web browsers support both, but some web servers, like Microsoft's IIS, only support session identifiers. Session identifiers are the slightly older mechanism and can potentially expose your site to Denial of Service attacks. Enabling either session identifiers or session tickets is done via your web server configuration and is quite easy. Consult with your administrator about getting these options enabled.

Optimization checklist

  1. Enable TLS resumption on your web servers.
  2. If possible, avoid using session identifiers to reduce your exposure to Denial of Service attacks.

Other TLS Options

There are several other TLS options and nuances we are glossing over: What asymmetric algorithm should you use? What key exchange protocol? What key size for your symmetric cipher? Should you be using perfect forward secrecy? These are important decisions from a security perspective, and everyone's needs are different. From a performance perspective, however, they are largely moot. It is best to leave these choices to whoever manages your server, or to follow the advice from Mozilla on the page linked above.

Minimizing TLS handshakes altogether

As we have seen, the TLS handshakes, while necessary, can have an impact on your performance:

  • They can delay the download of critical responses like the initial HTML page.
  • They can happen multiple times on a single page.
  • There isn't much we can do to optimize them.

While session resumption can cut the delay of a TLS handshake in half, it is still best to avoid TLS handshakes altogether. You can do this by minimizing how many HTTP connections a browser makes when visiting your website. Luckily, many traditional front-end performance optimizations that you should be doing anyway can help. This makes front-end performance optimizations even more important on sites secured with TLS. Let's focus on 4 optimizations that are particularly relevant for sites using TLS.

1. Persistent connections

Persistent connections allow the browser to make multiple requests over a single HTTP connection. They let the browser load the page faster because it can issue requests more quickly, and they also cut down on the number of TLS handshakes. Consider this waterfall, which we looked at before:

[Waterfall chart: a TLS handshake repeated on nearly every request]

See how virtually every HTTP request has a purple section? This purple section is the TLS handshake. Why does it keep happening? Because the web server is explicitly closing the HTTP connection, and thus the underlying TLS connection, with every response. We can see this with the Connection: close response header, as shown below:

[Screenshot: response headers including Connection: close]

This is terrible for performance in general, but especially bad for a site using TLS. Your website should be using persistent connections.

2. Domain sharding

Domain sharding is a technique to trick a visiting browser into downloading resources from your website more quickly. It works by giving a single web server several different hostnames. For example, your site might be named example.com, but configured to resolve the names static1.example.com and static2.example.com to the same server. Since browsers allow only a limited number of simultaneous HTTP connections to a single hostname, using multiple hostnames tricks the browser into downloading more content in parallel.

The problem with domain sharding is that the browser doesn't know that example.com, static1.example.com, and static2.example.com are all the same server. It will make new HTTP connections to each hostname, and have to do a full TLS handshake each time. In our example, we are potentially doing 3 times the number of TLS handshakes because of our sharded hostnames. Additionally, session resumption information for connections on one hostname cannot be used by connections to another hostname, even though under the covers all these names refer to the same server.

The net result is that the increased number of TLS handshakes caused by domain sharding may offset any advantage gained from downloading more content in parallel. In fact, sharding a TLS-protected website might actually make it slower. This is especially true if you follow the next two pieces of advice, which reduce the number of requests that need to be made at all.
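A back-of-the-envelope model makes the trade-off visible. The connection counts and round-trip time below are assumptions chosen for illustration, and the model ignores resumption within a hostname; its point is that shards multiply handshakes, since resumed sessions can't be shared across hostnames.

```python
# Rough model of first-visit TLS setup cost under domain sharding.
# Assumptions: each hostname opens 2 parallel connections, every new
# connection pays a full 2-round-trip handshake, and RTT is 125 ms.

rtt = 0.125
connections_per_host = 2
handshake = 2 * rtt

def total_handshake_time(num_hostnames):
    # Resumption doesn't cross hostnames, so every shard pays full price.
    return num_hostnames * connections_per_host * handshake

print(total_handshake_time(1))  # 0.5  -- one hostname
print(total_handshake_time(3))  # 1.5  -- example.com plus two shards
```

Whether the extra parallelism wins back that extra second depends on the page, which is why measuring with a tool like WebPageTest beats guessing.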

3. Combining CSS and JavaScript files

Combining multiple CSS or JavaScript files into one or two primary files is a huge front-end performance optimization. Browsers can download one 100 KB file faster than ten 10 KB files. The advantage for TLS sites is that fewer requests make it less likely the browser will need additional HTTP connections, each requiring a resumed or full TLS handshake.
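At its simplest, a bundling step is just ordered concatenation; real build tools also minify and manage dependencies. This is only a sketch, and the file names in the usage note are hypothetical.

```python
# Minimal sketch of a bundling build step: concatenate several source
# files into one so the browser makes a single request.

from pathlib import Path

def bundle(sources, out_path):
    # Join the files in the order given; order matters for scripts.
    combined = "\n".join(Path(p).read_text() for p in sources)
    Path(out_path).write_text(combined)
    return out_path

# Hypothetical usage:
# bundle(["jquery.js", "app.js"], "bundle.js")
```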

4. Caching resources

The fastest request is one the browser doesn't have to make. Caching might be the best front-end performance optimization you can make. If I just visited your site and I'm looking at a second page, there is no reason to download your logo a second time. If you don't use caching, the browser must check with your website whether it is OK to use the logo image it previously downloaded. This is called a conditional request, and it's bad for performance. Because of the difference between bandwidth and latency, even if you don't actually download anything from the server, simply sending a request to ask whether it is OK to reuse the logo takes almost as long as downloading the logo again.

Conditional requests are also bad for TLS. You are forcing the browser to create more HTTP connections, and thus perform more TLS handshakes, just to check whether content is still valid. Caching your static resources, like images, CSS, and JavaScript, has a big benefit and can prevent these additional connections.

Optimization checklist

  • Enable persistent connections. Ensure your application or CMS is not prematurely closing HTTP connections.
  • Use tools like WebPageTest to see if domain sharding will actually improve performance for your TLS enabled website.
  • Combine multiple JavaScript and CSS files into bundles where appropriate.
  • Cache your static resources for at least 5 minutes, even if you don't have a file-versioning system in place.
  • If you have the infrastructure and processes in place, use far-future caching with a file versioning system that changes the URLs of your resources when they change.
  • Test your site to ensure you are properly implementing front-end performance optimizations.
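As an illustrative nginx sketch of the short-lived caching suggested above (the file extensions and lifetime are just examples):

```nginx
# Illustrative sketch: cache static resources for 5 minutes, avoiding
# the conditional-request round trips described above.
location ~* \.(css|js|png|jpg|gif|svg)$ {
    add_header Cache-Control "public, max-age=300";  # 300 s = 5 minutes
}
```

With a file-versioning build in place, the max-age can safely be raised to a far-future value, since changed resources get new URLs.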

Summary

Google is now favoring websites secured with TLS in its search engine rankings. TLS can have an impact on performance, and this article has shown you the steps you can take to minimize that impact.

The data encryption overhead for secure connections is largely a problem of the past, thanks to faster CPUs with built-in support for AES cipher operations. The TLS handshake can be optimized by keeping your certificate chain short: purchase your certificate from a large, well-known vendor whose signing certificates are already on browsers' trusted lists. You can speed up subsequent TLS handshakes by enabling session resumption on your server. You can avoid many TLS handshakes altogether by implementing common front-end performance optimizations like persistent connections and caching, and by avoiding tricks like domain sharding. You can also use Zoompf's free performance report to ensure your website is using AES and is properly implementing the suggested front-end performance optimizations.

In our next blog post, we will discuss the intersection of security and performance that Google is creating with its new SPDY protocol.

If you'd like to stay on top of your website performance, consider joining the free Zoompf Alerts beta to automatically scan your website every day for the common causes of slow website performance.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

By |November 4th, 2014|MOZ|0 Comments

Google’s Physical Web and its Impact on Search

Posted by Tom-Anthony

In early October, Google announced a new project called "The Physical Web," which they explain like this:

The Physical Web is an approach to unleash the core superpower of the web: interaction on demand. People should be able to walk up to any smart device - a vending machine, a poster, a toy, a bus stop, a rental car - and not have to download an app first. Everything should be just a tap away.

At the moment this is an experimental project which is designed to promote establishing an open standard by which this mechanism could work. The two key elements of this initiative are:

URLs: The project proposes that all 'smart devices' should advertise a URL by which you can interact with that device. The device broadcasts its URL to anyone in the vicinity, who can detect it via their smartphone (the eventual goal being that this functionality is built into smartphone operating systems rather than requiring third-party apps).

Beacons: Not well known until Apple recently jumped on the bandwagon by announcing iBeacons, beacon technology has been around for a couple of years now. Using a streamlined sibling of Bluetooth called Bluetooth Low Energy (no pairing; a range of ~70 metres / ~230 feet), it allows smartphones to detect the presence of nearby beacons and their approximate distance. Until now, beacons have mostly been used for 'hyper-local,' location-based applications (check out this blog post of mine for some thoughts on how that might impact SEO).

The project proposes adapting and augmenting the signal that Beacons send out to include a URL by which nearby users might interact with a smart device.
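To see how a URL fits inside a beacon's tiny advertisement packet, here is a sketch modeled on the UriBeacon/Eddystone-URL idea, where common URL prefixes and suffixes collapse to single bytes. The byte values follow the published Eddystone-URL tables, but treat this as an illustration rather than a spec-complete encoder.

```python
# Sketch of URL compression for a beacon advertisement packet:
# common prefixes and suffixes collapse to single bytes.

PREFIXES = {"http://www.": 0x00, "https://www.": 0x01,
            "http://": 0x02, "https://": 0x03}
SUFFIXES = {".com/": 0x00, ".org/": 0x01, ".net/": 0x03, ".com": 0x07}

def encode_url(url):
    frame = bytearray()
    # Longest-match the scheme prefix into one byte.
    for text, code in sorted(PREFIXES.items(), key=lambda kv: -len(kv[0])):
        if url.startswith(text):
            frame.append(code)
            url = url[len(text):]
            break
    # Walk the rest, collapsing known suffixes, copying other characters.
    while url:
        for text, code in sorted(SUFFIXES.items(), key=lambda kv: -len(kv[0])):
            if url.startswith(text):
                frame.append(code)
                url = url[len(text):]
                break
        else:
            frame.append(ord(url[0]))
            url = url[1:]
    return bytes(frame)

encoded = encode_url("https://www.example.com/")
print(len(encoded))  # 9 -- a 24-character URL fits in 9 bytes
```

That compression matters because a BLE advertisement payload only has room for a few dozen bytes.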

This post looks to the future at ways this could potentially impact search. It isn't likely that any serious impact will happen within the next 18 months, and it is hard to predict exactly how things will pan out, but this post is designed to prompt you to think about these things proactively.

Usage examples

To help wrap your head around this, let's look at a few examples of possible uses:

Bus times: This is one of the examples Google gives: you walk up to a bus stop and, on detecting the smart device embedded in the stop, your phone allows you to pull up the latest bus times and travel info.

Item finder: Imagine you go to the store looking for a specific item. You could pull out your phone to check stock of the item, and even be directed to the specific part of the store where you'll find it.

Check in: Combined with URLs that are only accessible on a local wifi network or intranet, you could build a flexible and consistent check-in mechanism for people in a variety of situations.

I'm sure there are many, many more applications yet to be thought up. One thing to notice is that there is no reason you can't bookmark these advertised URLs and use them elsewhere, so you can't be sure that someone accessing a URL is actually near the device in question. You can get part of the way there by using URLs that are only accessible within a certain network, but that isn't going to be a general solution.

Also, note that these URLs don't need to be constrained to just website URLs; they could just as well be deep links into apps which you might have installed.

Parallels to the web and ranking

There are some obvious parallels to the web (which is likely why Google named it the way they did). There will be many smart devices which will map to URLs which anyone can go to. A corollary of this is that there will be similar issues to those we see in search engines today. Google already identified one such issue—ranking—on the page for the project:

At first, the nearby smart devices will be small, but if we're successful, there will be many to choose from and that raises an important UX issue. This is where ranking comes in. Today, we are perfectly happy typing "tennis" into a search engine and getting millions of results back, we trust that the first 10 are the best ones. The same applies here. The phone agent can sort by both signal strength as well as personal preference and history, among many other possible factors. Clearly there is lots of work to be done here.

So there is an immediate parallel between Google's role on the world wide web and its potential role on this new physical web; the suggestion is that someone will need to rank beacons if they become so numerous that our phones or wearable devices are routinely picking up a variety of them.

Google proposes proximity as the primary ranking measure, but the proximity estimates from BLE technology are very imprecise, so I imagine that in dense urban areas proximity alone won't be sufficient. Furthermore, given that beacons are cheap (in bulk, $5 apiece will get you standalone beacons with a year-long battery), I imagine there could be "smart device spam."

At that point, you need some sort of ranking mechanism and that will inevitably lead to people trying to optimise (be it manipulative or a more white-hat approach).
However, I don't think that will be the sole impact on search. There are several other possible outcomes.
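As a toy version of that proximity ranking, the standard log-distance path-loss model turns received signal strength into a rough distance estimate. The calibration values and beacon names below are invented for illustration, and as noted above, real BLE readings are far noisier than this makes them look.

```python
# Toy beacon ranker using the log-distance path-loss model.

def estimated_distance_m(rssi, tx_power=-59, path_loss_exp=2.0):
    """tx_power is the calibrated RSSI at 1 metre; the path-loss
    exponent is ~2 in free space and higher indoors."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

# Hypothetical nearby beacons and their received signal strengths (dBm):
beacons = {"bus-stop": -62, "poster": -75, "vending-machine": -90}
ranked = sorted(beacons, key=lambda name: estimated_distance_m(beacons[name]))
print(ranked)  # ['bus-stop', 'poster', 'vending-machine']
```

A real ranker would blend this noisy distance signal with personal history and preferences, as Google's project page suggests.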

Further impacts on the search industry

1. Locating out-of-range smart devices

Imagine that these smart devices become fairly widespread and are constantly advertising information to anyone nearby with a smartphone. In a similar vein to schema.org actions, which provide a standard way for websites to describe what they enable someone to do ("affordances," for the academics), we could establish similar semantic standards for smart devices, enabling them to advertise what services or goods they provide.

Now imagine you are looking for a specific product or service, which you want as quickly as possible (e.g "I need to pick up a charger for my phone," or "I need to charge my phone on the move"). You could imagine that Google or some other search engine will have mapped these smart devices. If the above section was about "ranking," then this is about "indexing."

You could even imagine they could keep track of what is in stock at each of these places, enabling "environment-aware" searches. How might this work? Users in the vicinity whose devices have picked up the beacons and read their (standardised) list of services could feed this back into Google's index. It sounds like a strange paradigm, but it is exactly how Google's app indexing methodology works.

2. Added context

Context is becoming increasingly important for all the searches we do. Beyond your search phrase, Google looks at what device you are on, where you are, what you have recently searched for, who you know, and quite a bit more. It makes our search experiences significantly better, and we should expect Google to continue refining its understanding of our context.

It is not hard to see that knowing what beacons people are near adds various facets of context. It can help refine location even further, giving indications to the environment you are in, what you are doing, and even what you might be looking for.

3. Passive searches

I've spoken a little bit about passive searches before; this is when Google runs searches for you based entirely on your context, with no explicit search. Google Now is currently the embodiment of this technology, but I expect we'll see it become more and more prevalent.

I believe we could even see a more explicit element of this become a reality with the rise of conversational search. Conversational search is already at a point where search queries can have persistent aspects ("How old is Tom Cruise?", then "How tall is he?", where the pronoun 'he' refers back to the previous search). I expect we'll see this expand into multi-stage searches ("Sushi restaurants within 10 minutes of here," and then "Just those with 4 stars or more").
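The persistent-context idea can be sketched in a few lines: swap a pronoun in the follow-up query for the last-mentioned entity. Real conversational search does far richer coreference resolution than this toy.

```python
# Toy sketch of persistent query context: a pronoun in a follow-up
# query is replaced with the entity from the previous search.

PRONOUNS = {"he", "she", "it", "they"}

def resolve(query, last_entity):
    words = query.rstrip("?").split()
    out = [last_entity if w.lower() in PRONOUNS else w for w in words]
    return " ".join(out) + "?"

print(resolve("How tall is he?", "Tom Cruise"))  # How tall is Tom Cruise?
```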

So, I could easily imagine that these elements combine with "environment-aware" searches (whether they are powered in the fashion I described above or not) to enable multi-stage searches that result in explicit passive searches. For example, "nearby shops with iPhone 6 cables in stock," to which Google fails to find a suitable result ("there are no suitable shops nearby") and you might then answer "let me know when there is."

Wrap up

It seems certain that embedded smart devices of some sort are coming, and this project from Google looks like a strong candidate to establish a standard. With the rise of smart devices, whichever form they end up taking and standard they end up using, it is certain this is going to impact the way people interact with their environments and use their smart phones and wearables.

It is hard to believe this won't also have a heavy impact on marketing and business. What remains less clear is the scale of the impact on SEO. Hopefully this post has got your brain going a bit so that, as an industry, we can start to prepare ourselves for the rise of smart devices.

I'd love to hear in the comments what other ideas people have and how you guys think this stuff might affect us.



By |November 3rd, 2014|MOZ|0 Comments

What SEOs Need to Know About Topic Modeling & Semantic Connectivity – Whiteboard Friday

Posted by randfish

Search engines, especially Google, have gotten remarkably good at understanding searchers' intent—what we mean to search for, even if that's not exactly what we search for. How in the world do they do this? It's incredibly complex, but in today's Whiteboard Friday, Rand covers the basics—what we all need to know about how entities are connected in search.

For reference, here's a still of this week's whiteboard!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're talking topic modeling and semantic connectivity. Those words might sound big and confusing, but, in fact, they are important to understanding the operations of search engines, and they have some direct influence on things that we might do as SEOs, hence our need to understand them.

Now, I'm going to make a caveat here. I am not an expert in this topic. I have not taken the required math classes, stats classes, programming classes to truly understand this topic in a way that I would feel extremely comfortable explaining. However, even at the surface level of understanding, I feel like I can give some compelling information that hopefully you all and myself included can go research some more about. We're certainly investigating a lot of topic modeling opportunities and possibilities here at Moz. We've done so in the past, and we're revisiting that again for some future tools, so the topic is fresh on my mind.

So here's the basic concept. The idea is that search engines are smarter than just knowing that a word, a phrase that someone searches for, like "Super Mario Brothers," is only supposed to bring back results that have exactly the words "Super Mario Brothers," that perfect phrase in the title and in the headline and in the document itself. That's still an SEO best practice because you're trying to serve visitors who have that search query. But search engines are actually a lot smarter than this.

One of my favorite examples is how intelligent Google has gotten around movie topics. So try, for example, searching for "That movie where the guy is called The Dude," and you will see that Google properly returns "The Big Lebowski" in the first ranking position. How do they know that? Well, they've essentially connected up "movie" and "The Dude" and said, "Aha, those things are most closely related to 'The Big Lebowski.' That's the intent of the searcher, and that's the document we're going to return -- not a document that happens to have exactly the words 'that movie about the guy named The Dude' in the title."

Here's another example. So this is Super Mario Brothers, and Super Mario Brothers might be connected to a lot of other terms and phrases. So a search engine might understand that Super Mario Brothers is a little bit more semantically connected to Mario than it is to Luigi, then to Nintendo, then to Bowser -- the jumping dragon guy, turtle with spikes on his back, I'm not sure exactly what he is -- and then to Princess Peach.

As you go down here, the search engine might actually have a topic modeling algorithm: something like latent semantic indexing, which was an early model; latent Dirichlet allocation, a somewhat later one; or even predictive latent Dirichlet allocation, which is more recent still. The specific model's not particularly important, especially for our purposes.

What is important is to know that there's probably some scoring going on. A search engine -- Google, Bing -- can understand that some of these words are more connected to Super Mario Brothers than others, and it can do the reverse. They can say Super Mario Brothers is somewhat connected to video games and very not connected to cat food. So if we find a page that happens to have the title element of Super Mario Brothers, but most of the on-page content seems to be about cat food, well, maybe we shouldn't rank that even if it has lots of incoming links with anchor text saying "Super Mario Brothers" or a very high PageRank or domain authority or those kinds of things.
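To make that scoring idea concrete, here's a toy sketch of co-occurrence-based connectivity: how often does one term appear in documents that contain another? The documents below are made up for illustration; a real engine works over web-scale corpora and far richer signals.

```python
# Toy connectivity score: of the documents containing term_a,
# what fraction also contain term_b? (Illustrative data only.)
docs = [
    "super mario brothers mario luigi nintendo",
    "mario jumps on bowser to save princess peach",
    "super mario brothers is a nintendo video game",
    "cat food brands reviewed for picky cats",
]

def connectivity(term_a, term_b, documents):
    """Fraction of documents containing term_a that also contain term_b."""
    with_a = [d for d in documents if term_a in d]
    if not with_a:
        return 0.0
    return sum(term_b in d for d in with_a) / len(with_a)

print(connectivity("mario", "nintendo", docs))  # high: related terms
print(connectivity("mario", "cat food", docs))  # 0.0: unrelated terms
```

Even this crude measure already separates "nintendo" from "cat food" relative to "mario," which is the intuition behind the scoring Rand describes.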

So search engines, and Google in particular, have gotten very, very smart about this connectivity stuff and this topic modeling post-Hummingbird. Hummingbird, of course, being the algorithm update from last fall that changed a lot of how they can interpret words and phrases.

So knowing that Google and Bing can calculate this relative connectivity, connectivity between the words and phrases and topics, we want to know how are they doing this. That answer is actually extremely broad. So that could come from co-occurrence in web documents. Sorry for turning my back on the camera. I know I'm supposed to move like this, but I just had to do a little twirl for you.

Distance between the keywords. I mean distance on the actual page itself. Does Google find "Super Mario Brothers" near the word "Mario" on a lot of the documents where the two occur, or are they relatively far away? Maybe Super Mario Brothers does appear with cat food a lot, but they're quite far away. They might look at citations and links between documents in terms of, boy, there are a lot of pages on the web that, when they talk about Super Mario Brothers, also link to pages about Mario, Luigi, Nintendo, etc.
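The on-page distance signal is easy to sketch: tokenize the page and find the smallest gap between any occurrence of one term and any occurrence of another. The page text below is invented for the example.

```python
# Minimal sketch: smallest token distance between two terms on a page.
# Page text is illustrative, not from a real document.
def min_token_distance(text, term_a, term_b):
    tokens = text.lower().split()
    pos_a = [i for i, t in enumerate(tokens) if t == term_a]
    pos_b = [i for i, t in enumerate(tokens) if t == term_b]
    if not pos_a or not pos_b:
        return None  # one of the terms never appears
    return min(abs(a - b) for a in pos_a for b in pos_b)

page = "super mario brothers stars mario and luigi while bowser kidnaps princess peach"
print(min_token_distance(page, "mario", "luigi"))  # 2: close together
print(min_token_distance(page, "mario", "peach"))  # 7: farther apart
```

Terms that consistently appear close together across many documents are stronger evidence of a semantic connection than terms that merely share a page.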

They can look at the anchor text connections of those links. They could look at co-occurrence of those words biased by a given corpus, or a set of corpora, or from certain domains. So they might say, "Hey, we only want to pay attention to what's on the fresh web right now or in the blogosphere or on news sites or on trusted domains, these kinds of things as opposed to looking at all of the documents on the web." They might choose to do this across multiple different corpora.

They can look at queries from searchers, which is a really powerful thing that we unfortunately don't have access to. So they might see searcher behavior saying that a lot of people who search for Mario, Luigi, Nintendo are also searching for Super Mario Brothers.

They might look at searcher clicks, visits, history, all of that browser data that they've got from Chrome and from Android and, of course, from Google itself, and they might say those are corpora that they use to connect up words and phrases.

Probably there's a whole list of other places that they're getting this from. So they can build a very robust data set to connect words and phrases. For us, as SEOs, this means a few things.

If you're targeting a keyword for rankings, say "Super Mario Brothers," those semantically connected and related terms and phrases can help with a number of things. So if you could know that these were the right words and phrases that search engines connected to Super Mario Brothers, you can do all sorts of stuff. Things like inclusion on the page itself, helping to tell the search engine my page is more relevant for Super Mario Brothers because I include words like Mario, Luigi, Princess Peach, Bowser, Nintendo, etc. as opposed to things like cat food, dog food, T-shirts, glasses, what have you.
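One simple way to act on this as an SEO is to check what share of the semantically related terms a page actually mentions. The related-term list below is assumed for illustration; in practice you'd derive it from related searches, top-ranking pages, or a tool.

```python
# Hedged sketch: score a page by the fraction of related terms it mentions.
# The related-term list is an assumption, not output from any real API.
related = ["mario", "luigi", "nintendo", "bowser", "princess peach"]

def topic_coverage(page_text, related_terms):
    text = page_text.lower()
    return sum(term in text for term in related_terms) / len(related_terms)

on_topic = "Super Mario Brothers features Mario, Luigi, and Bowser on Nintendo."
off_topic = "Our cat food is made with real salmon and chicken."
print(topic_coverage(on_topic, related))   # 0.8
print(topic_coverage(off_topic, related))  # 0.0
```

A high coverage score is no ranking guarantee, but a near-zero one is a useful warning that the page may read as off-topic to an engine doing this kind of modeling.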

You can think about it in the links that you earn, the documents that are linking to you and whether they contain those words and phrases and are on those topics, the anchor text that points to you potentially. You can certainly be thinking about this from a naming convention and branding standpoint. So if you're going to call a product something or call a page something or your unique version of it, you might think about including more of these words or biasing to have those words in the description of the product itself, the formal product description.

For an About page, you might think about the formal bio for a person or a company, including those kinds of words, so that as you're getting cited around the web or on your book cover jacket or in the presentation that you give at a conference, those words are included. They don't necessarily have to be links. This is a potentially powerful thing: a search engine might notice that a lot of people who mention Super Mario Brothers tend to point to the page Nintendo8.com, where I think you can actually play the original "Super Mario Brothers" live on the web. It's kind of fun. Sorry to waste your afternoon with that.

Of course, these can also be additional keywords that you might consider targeting. This can be part of your keyword research in addition to your on-page and link building optimization.

What's unfortunate is right now there are not a lot of tools out there to help you with this process. There is a tool from Virante. Russ Jones, I think did some funding internally to put this together, and it's quite cool. It's nTopic.org. Hopefully, this Whiteboard Friday won't bring that tool to its knees by sending tons of traffic over there. But if it does, maybe give it a few days and come back. It gives you a broad score with a little more data if you register and log in. It's got a plugin for Chrome and for WordPress. It's fairly simplistic right now, but it might help you say, "Is this page on the topic of the term or phrase that I'm targeting?"

There are many, many downloadable tools and libraries. In fact, Code.google.com has an LDA topic modeling tool specifically, and that might have been something that Google used back in the day. We don't know.

If you do a search for topic modeling tools, you can find these. Unfortunately, almost all of them are going to require some web development background at the very least. Many of them rely on a Python library or an API. Almost all of them also require a training corpus in order to model things on. So you can think about, "Well, maybe I can download Wikipedia's content and use that as a training model or use the top 10 search results from Google as some sort of training model."
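The "use the top 10 search results as a training model" idea can be sketched without any special library: build a term-frequency vector from the assumed top-ranking text and score a candidate page by cosine similarity against it. All texts here are invented stand-ins for real scraped results.

```python
# Sketch of the "top results as a training corpus" idea: compare a
# candidate page to a term-frequency model of assumed top-ranking text.
import math
from collections import Counter

def tf_vector(text):
    return Counter(text.lower().split())

def cosine(v1, v2):
    dot = sum(v1[t] * v2[t] for t in v1)
    norm = math.sqrt(sum(c * c for c in v1.values())) * \
           math.sqrt(sum(c * c for c in v2.values()))
    return dot / norm if norm else 0.0

# Stand-in for text pooled from top-ranking pages (illustrative only).
top_results = "mario luigi nintendo bowser peach mario nintendo game"
candidate_a = "mario and luigi star in this nintendo game"
candidate_b = "best cat food for senior cats"

model = tf_vector(top_results)
print(cosine(model, tf_vector(candidate_a)))  # positive: on topic
print(cosine(model, tf_vector(candidate_b)))  # 0.0: no shared terms
```

Real topic modeling tools do far more than raw term overlap, but even this crude baseline will flag a page that shares no vocabulary with what currently ranks.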

This is tough stuff. This is one of the reasons why at Moz I'm particularly passionate about trying to make this something that we can help with in our on-page optimization and keyword difficulty tools, because I think this can be very powerful stuff.

What is true is that you can spot check this yourself right now. It is very possible to go look at things like related searches, look at the keyword terms and phrases that also appear on the pages that are ranking in the top 10 and extract these things out and use your own mental intelligence to say, "Are these terms and phrases relevant? Should they be included? Are these things that people would be looking for? Are they topically relevant?" Consider including them and using them for all of these things. Hopefully, over time, we'll get more sophisticated in the SEO world with tools that can help with this.
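The spot-check Rand describes -- pulling the terms that recur across top-ranking pages -- is a few lines of code. The page texts below are made up; in practice you'd substitute the actual content of the top 10 results.

```python
# Rough spot-check sketch: which terms appear on several of the
# top-ranking pages? (Page texts are made-up placeholders.)
from collections import Counter

top_pages = [
    "super mario brothers mario luigi nintendo platformer",
    "mario luigi bowser nintendo classic game",
    "nintendo super mario brothers review mario",
]
stopwords = {"the", "a", "and", "of"}

# Count each term once per page, so frequency = number of pages it's on.
counts = Counter(
    token for page in top_pages for token in set(page.split())
    if token not in stopwords
)
common = [t for t, c in counts.most_common() if c >= 2]
print(common)  # terms found on at least two of the three pages
```

From there it's your own judgment call, as the transcript says: are these terms topically relevant, and should your page include them?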

All right, everyone, hope you've enjoyed this edition of Whiteboard Friday. Look forward to some great comments, and we'll see you again next week. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

October 31st, 2014 | MOZ | 0 Comments