Blog

How to Avoid the Unrealistic Expectations SEOs Often Create – Whiteboard Friday


Posted by randfish

With all the changes we've seen in the field of SEO in recent years, we need to think differently about how we pitch our work to others. If we don't, we run the risk of creating unrealistic expectations and disappointing our clients and companies. In today's Whiteboard Friday, Rand explains how to set expectations that will lead to excitement without the subsequent let-down.

For reference, here's a still of this week's whiteboard!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat a little bit about the expectations that SEOs create and sometimes falsely create. It's not always our fault, but it is always our responsibility to fix the expectations that we create with our teams, our managers, our executives, and, if we're consultants, with our clients.

So here's the problem. This is a conversation that I see happen a lot of the time. Here's our friendly SEO guy over here, and he's telling his client, "Hey, if we can rank on page 1 for even 10% of all these terms that I've selected, we're going to drive a 500% increase in leads."

Here's the client over here, and she's thinking to herself, "That sounds amazing. A 500% increase in leads, that's going to do wonderful things for my business. So let's invest in SEO. This is going to be great. We only have to get 10% of these keywords on there. I don't know anything about SEO, but that sounds totally possible." Six months later, after all sorts of stymieing and challenging problems, here she is going, "You told me we'd increase our leads by 500%!"

There's the SEO saying, "Well yeah, but we have to get the rankings first, and we haven't done that yet. I said we'd get the leads once we got the rankings."

This kind of expectation and many others like it are a huge challenge. It is the case that modern SEO takes a lot of time to show results. Modern content marketing works the same way. You're not going to start producing blog posts or interactive content or big content pieces and 3 months from now go, "Well, we made 50 new content pieces, and thus our traffic has tripled."

That's not how it works. The problem here is that SEO just doesn't look like this anymore. It did, kind of, at one point. It really did.

We used to engage in an SEO contract. We'd make some changes to the existing pages, do some keyword targeting, some optimization, maybe fix things up that weren't SEO friendly on the site, get our link structure in order. Great. Do a little bit of link building to the right kinds of pages that we need on our site from the right kind of places. We'd get those rankings. Then we could easily prove the value of the search traffic that was coming through by looking at the keyword referrals in our analytics report, because keyword data was still being passed.

This process has been broken over the last five, six, seven years. But expectations have not caught up to where we are today. Modern SEO nowadays is really like this. You engage in that SEO contract, and then the SEO's job is to be much more than an SEO, because there are so many factors that influence modern search rankings and modern search algorithms that really a great SEO, in order to have impact, has to go, "All right, now we're going to start the audit."

The audit isn't just going to look at which pages you have on the site, which keywords you want to match up, and which ones we need to fix, or link structure, or even things like schema. We'll also look at the content and the user experience and the branding and the PR, and we'll check out your accessibility and speed and keyword targeting. We'll do some competitive analysis, etc. There are dozens of things that we're going to potentially look at, because all of them can impact SEO.

Yikes! Then, we're not done. We're going to determine which investments we could possibly make into all of these things, almost all of which probably need some form of fixing. Some are more broken than others. Some we have an actual team that could go and fix; some of those teams have bandwidth and some don't. Some of those projects have executives who will approve them, and some don't. We're going to figure out which ones are possible, which ones are most likely to be done and actually drive ROI. Then we're going to work across teams and executives and people to get all those different things done, because one human being can't handle all of them unless we're talking about a very, very tiny site.

Then we're going to need to bolster a wide range of offsite signals, all of the things that we've talked about historically on Whiteboard Friday, everything from actual links to things around engagement to social media signals that correlate with those to PR and branding and voice and coverage.

Now, after months of waiting, if we've improved the right things, we'll start to see our rankings creeping up, and we'll be able to measure that from the traffic that pages receive. But we won't be able to say, "Well, specifically this page now ranks higher for this keyword, and that keyword now sends us this amount of traffic," because keyword not provided is taking away that data, making it very, very hard to see the value of visitors directly from search. That's very frustrating.

This is the new SEO process. You might be asking yourself, "Given these immense challenges, who in the world is even going to invest in SEO anymore?" The answer is, well, people who for the last decade have made a fortune or made a living on SEO, people who are aware of the power that SEO can drive, people who are aware of the fact that search continues to grow massively, that the channel is still hugely valuable, that it drives direct revenue and value in far greater quantity than social media by itself or content marketing by itself without SEO as a channel. The people who are going to invest successfully, though, are those whose expectations are properly set.

Everybody else is going to get somewhere in here, and they're going to give up. They're going to fire their SEO. You know what one of the things that really nags at me is? Ruth Burr mentioned this on Twitter the other day. Ruth said, "When your plumber fails to fix your pipes, you don't assume that plumbing is a dead industry that no one should ever invest in. But when your SEO fails to get you rankings or traffic that you can measure, you assume all SEO is dead and all SEO is bad."

That sucks. That's a hard reality to live in, but it's the one that we do live in.

I do have a solution though, and the solution isn't just showing how this process works versus how old-school SEO works. It's to craft a timeline, an expectation timeline.

When you're signing a contract or when you're pitching a project, or when you're talking about, "Hey, this is what we're going to do for SEO," try showing a timeline of the expectations. Instead of saying, "If we can rank on page one," say, "If we can complete our audit and fix the things we determine need to be fixed, prioritizing those fixes in the order we think they should go, then we can make the right kinds of content investments, and then we can get the amplification and offsite signals we need starting to appear and grow our engagement. Then we can expect great SEO results." Each one of these is contingent on the last one.

So six months later, your boss, your manager, or your client is going to say, "Hey, how did those content investments go?" You can say, "Well look, here's the content we've created, this is how it's performing, and this is what we're going to do to improve that performance." The expectation won't be, "Hey, you promised me great SEO." The promise was we're going to make these fixes, which we did, and we're going to complete that audit, which we did. Now we're working on these content investments, and here's how that's going. Then we're going to work on this, and then we're going to work on that.

This is a great way to show expectations and to create the right kind of mindset in people who are going to be investing in SEO. It's also a great way not to get yourself into hot water when you don't get that 500% increase 3 months or 6 months after you said we're going to start the SEO process.

All right everyone, I'd love to hear from you in the comments. Look forward to chatting it up and having a discussion about modern SEO and old-school SEO and expectations that clients and managers have got.

We will see you again, next week, for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



By |December 12th, 2014|MOZ

You can now pay for that Microsoft Xbox in bitcoins


It's now possible to pay for Microsoft products with Bitcoin — just in time for the holidays.

Microsoft customers in the United States can use the digital currency to add money to their online accounts and purchase software, apps and games, including those for Xbox consoles — like Xbox Music — and Windows phones or computers.

At the time of writing, no official announcement had been made about the move to support Bitcoin, but Microsoft did quietly post a how-to guide on its website.

Bitcoin payments for Microsoft products are currently only supported in the U.S.; users must first add bitcoins to their accounts, redeeming them for regulated U.S. dollars, before making a purchase ...


By |December 11th, 2014|Apps and Software

Nokia Here maps find their way to Google Play


One of Nokia's strengths, before the bulk of the company was acquired by Microsoft, was that its devices came with a solid maps and navigation app called Here.

Now, on Wednesday, Nokia launched Here for Android devices on Google Play. The price? Free.

If you've been using Google Maps — also a free, and often pre-installed, Android app — for your GPS navigation needs, you might wonder what the point of installing Nokia's offering would be. But while Maps is more about, well, maps, Nokia Here is much more akin to a standard GPS voice-guided navigation app — and those usually do not come free ...


By |December 11th, 2014|Apps and Software

HTTP/2: A Fast, Secure Bedrock for the Future of SEO


Posted by Zoompf

In prior articles, we've written extensively about website performance and securing your website, both of which Google has publicly announced as search ranking factors. These articles provide extensive tips using existing tools and technologies to improve your site's performance and security (tips we highly recommend you follow). But did you know Google also developed and is championing a new web transport protocol called SPDY that addresses many of the inherent performance and security flaws in the web today?

In this article I will dive into more detail on how this new protocol works, why it is important to you, and how you can get started using it today.

From experiment to standard

Google created the SPDY protocol as a multi-year experiment to find a faster way for browsers and servers to communicate. The results have been so positive that the Internet Engineering Task Force (IETF) is using SPDY as the basis for HTTP/2, a replacement for the current network protocol that powers all Internet web traffic today. While technically HTTP/2 is still an evolving specification, many web browsers, web servers, networking devices, and websites already support both SPDY and HTTP/2 in their current forms.

While there are some subtle differences between SPDY and HTTP/2, for the purposes of this article it's safe to use those terms interchangeably. As HTTP/2 rises to prominence in the popular vocabulary, the SPDY vernacular will fall out of use in favor of HTTP/2. For this reason, I will simply refer to SPDY as HTTP/2 for the remainder of this article.

What problem is HTTP/2 trying to solve?

To understand why Google and the IETF are creating a new version of HTTP, we need to understand the fundamental performance limitations we have today. It helps to consider this analogy:

Imagine if all the roads in the modern world had been built back during the age of horse-drawn carriages: narrow, bumpy, and with low speed limits (still true in some cities...). Sure, it took a while to get anywhere, but the delay was mostly due to the speed of your horse. Flash forward to today: same bumpy roads, but now everyone is driving a car. Now the horse is not the bottleneck; instead, it's all those cars piling up on the same log-jammed road!

Believe it or not, most website traffic today is not far from this analogy. The original HTTP protocol dates back nearly 25 years. The most recent update is HTTP/1.1, which was standardized back in 1999. That is a lifetime in Internet time!

Like those narrow, bumpy roads of yore, the web back then was a very different place: smaller web pages, slower Internet connections, and limited server hardware. In a sense, the "horse" was the bottleneck. HTTP/1.1 was very much a product of those times.

For example, when a web browser loads a web page using HTTP/1.1, it can request only one resource at a time (an image, a JavaScript file, etc.) per connection to the server. It looks like this:

[Figure: http11-basic]

You'll notice the browser spends a long time waiting on each request. While HTTP/1.1 won't let us make multiple requests at the same time over the same connection, browsers can try to speed things up by making two connections to the same server, as shown in the diagram below:

[Figure: http11-multiple]

Using two connections is a little better, but the browser still spends a lot of time waiting on downloads, and we can only fetch two resources at a time. We could try making more connections to download more resources in parallel, and modern browsers do exactly that, making between 2 and 6 connections per server. Unfortunately, this is still a poor approach, because each connection itself is used so inefficiently. Since the average web page has over 100 resources, the delay from making all those individual requests one at a time over just a few connections adds up, and your page loads slowly.
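
To make the bottleneck concrete, here's a minimal sketch in Go that mimics an older browser by sticking to HTTP/1.1 and capping itself at two connections per host (example.com and the asset names are placeholder values):

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

func main() {
	// Mimic an older browser: plain HTTP/1.1 with at most
	// two connections to any one host.
	client := &http.Client{
		Transport: &http.Transport{
			ForceAttemptHTTP2: false, // stay on HTTP/1.1
			MaxConnsPerHost:   2,     // browser-style connection cap
		},
	}

	// Ten hypothetical page resources on a single origin.
	urls := make([]string, 10)
	for i := range urls {
		urls[i] = fmt.Sprintf("https://example.com/asset-%d.png", i)
	}

	start := time.Now()
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := client.Get(u)
			if err != nil {
				return
			}
			resp.Body.Close()
		}(u)
	}
	wg.Wait()

	// With only 2 connections, the 10 requests are served in batches,
	// so total time is roughly (10 / 2) x the single-request latency.
	fmt.Println("elapsed:", time.Since(start))
}
```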

You can actually see this inefficiency by looking at a waterfall chart. We discussed waterfalls in a previous Moz post on optimizing Time To First Byte, and we also have a detailed guide on how to read waterfall charts. Most waterfall charts will show long green sections, which represent the time the browser is waiting to download a resource. All that time wasted on waiting instead of downloading is a major reason why websites load slowly.

This inefficient waiting on resources is why optimizations like combining JavaScript or CSS files can help your site load faster. But optimizations like this are just stopgap measures. While you can (and should) continue to optimize your pages to make fewer and smaller requests, we're not going to truly evolve to the next level of performance until we "fix the roads" and improve the fundamental way in which the web communicates. Specifically, we need to find a better way to utilize those network connections.

This is where HTTP/2 comes in.

The solution: HTTP/2

At its core, HTTP/2 is about using the underlying network connections more efficiently. HTTP/2 changes how requests and responses travel on the wire, a key limitation in the prior versions of HTTP.

HTTP/2 works by making a single connection to the server, and then "multiplexing" multiple requests over that connection to receive multiple responses at the same time. It looks like this:

[Figure: http2-multiplexing]

The browser is using a single connection, but it no longer requests items one at a time. Here we see the browser receives the response headers for file #3 (maybe an image), and then it receives the response body for file #1. Next it starts getting the response body for file #3, before continuing on to file #2.
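
For a sense of what multiplexing looks like in code, here's a minimal client sketch using Go's golang.org/x/net/http2 package; the URLs are placeholders, and the origin is assumed to support HTTP/2. All three requests travel as concurrent streams over a single connection:

```go
package main

import (
	"fmt"
	"net/http"
	"sync"

	"golang.org/x/net/http2"
)

func main() {
	// http2.Transport opens one TLS connection per host and carries
	// every request below as a concurrent stream over that connection.
	client := &http.Client{Transport: &http2.Transport{}}

	// Hypothetical resources on a single HTTP/2-enabled origin.
	urls := []string{
		"https://example.com/style.css",
		"https://example.com/app.js",
		"https://example.com/logo.png",
	}

	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := client.Get(u)
			if err != nil {
				fmt.Println(u, "error:", err)
				return
			}
			resp.Body.Close()
			// resp.Proto reports "HTTP/2.0" when multiplexing is in effect.
			fmt.Println(u, resp.Proto, resp.Status)
		}(u)
	}
	wg.Wait()
}
```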

Think of multiplexing like going to the grocery store and calling your spouse just once to get the full list: "Okay, we need milk, eggs, and butter. Check." Compare this to HTTP/1.1, which is like calling your spouse over and over: "Do we need milk? Okay, bye." "Hello, me again — do we need eggs too? Yep, okay." "Okay, sorry, one last question: do we need flour too? Nope, good."

All of that data is interwoven much more efficiently on that single connection. The server can supply the browser with data whenever it is ready. There is no more "make request; do nothing while waiting; download response" loop. While slightly more complex to understand, this approach has several advantages.

First of all, network connections don't sit idle while you are waiting on a single resource to finish downloading. For example, instead of waiting for one image to finish downloading before starting the next, your browser could actually finish downloading image 2 before image 1 even completes.

This also prevents what is known as head-of-line blocking: when a large or slow resource (say, a 1 MB background image) blocks all other resources from downloading until it completes. Under HTTP/1.1, browsers would only download one resource at a time per connection. HTTP/2's multiplexing approach allows browsers to download all those other, smaller 5 KB images in parallel over the same connection and display them as they become available. This is a much better user experience.

Another great performance benefit of HTTP/2 is the "Server Push" feature: this allows the server to proactively push content to a visitor without them requesting it. So, for example, when a browser visits your website, your server can actually "push" your logo image down to the browser before it even knows it needs it. By proactively pushing needed resources from the server, the browser can load pages much quicker than was previously possible.
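
Here's a hedged sketch of what Server Push can look like from the server side, using Go's net/http (Go 1.8+), where the ResponseWriter implements http.Pusher on HTTP/2 connections; the /static/logo.png asset and the certificate paths are placeholders:

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	// Serve the pushed asset itself; ./static/logo.png is hypothetical.
	http.Handle("/static/", http.FileServer(http.Dir(".")))

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// On HTTP/2 connections the ResponseWriter implements
		// http.Pusher, letting us send the logo before the browser
		// has even parsed the HTML that references it.
		if pusher, ok := w.(http.Pusher); ok {
			if err := pusher.Push("/static/logo.png", nil); err != nil {
				fmt.Println("push failed:", err)
			}
		}
		fmt.Fprintln(w, `<html><body><img src="/static/logo.png"></body></html>`)
	})

	// cert.pem and key.pem are placeholder TLS certificate paths;
	// push only works over an encrypted HTTP/2 connection.
	http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil)
}
```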

Last, but not least: HTTP/2 works best with HTTPS. As we mentioned before, both performance and security are ever-increasing components of search ranking. While the HTTP/2 specification technically allows for use over unencrypted connections, Google's earlier SPDY protocol required HTTPS, and for compatibility reasons most web server software will only use HTTP/2 over an encrypted HTTPS connection. Getting on the HTTPS bandwagon not only protects the security of your users and is good for your search ranking, but is also the most effective way to adopt HTTP/2. For more information, see our prior post on enabling HTTPS.

The future, today!

So clearly HTTP/2 offers some great benefits for both performance and security, but what does this mean for you right now? Well, you may be surprised to learn that HTTP/2 is already available, and you can support it today without impacting your users on older browsers still running HTTP/1.1.

You can think of HTTP/2 just like any other protocol, or even a spoken language. For it to work, you just need an agreement from both the sender and receiver to speak the same language. In this case, the "sender" is the web browser and the receiver is your web server.

Browser support

Since it's unlikely you will create your own web browser like Microsoft, Google, Apple, or Mozilla, you will not need to worry about the "sender" side of the equation. HTTP/2 support is already widespread across modern browsers, with adoption only increasing as older browser versions age out.

In fact, the latest versions of all the major desktop web browsers already support HTTP/2. Chrome and Firefox have supported it for several years. Apple added support to Safari in the fall of 2014 with Safari 8. IE 11 supports HTTP/2, but only if you are running Windows 8.

Similarly, there is already widespread HTTP/2 adoption on smartphones. Android's older web browser, helpfully named Browser, has supported HTTP/2 for several years. The current default browser for Android is Google's Chrome browser, and mobile versions of Chrome use the same networking code as desktop Chrome. This means that Chrome on Android devices, as well as Chrome on iOS devices, supports HTTP/2. Apple added support to the iOS version of Safari with iOS 8.

Your best bet is to look at your website analytics and see which web browsers your visitors are using. Chances are, the majority of your visitors have HTTP/2-capable web browsers (you can check against this list of desktop and mobile browsers that support HTTP/2). In that case, you can safely move on to the next step.

Web server support

While you have little control over which browsers your visitors use, you do have direct control over your web server. Put quite simply, to support HTTP/2 you need to select a web server that supports HTTP/2 and enable it. Of course, that server should continue to support HTTP/1.1 as well, because you will always have visitors on older browsers.

Continuing our "spoken language" analogy from before, you can think of HTTP/1.1 and HTTP/2 as different languages, like English and French. As long as both parties can speak the same language, they can communicate. If your server only supports HTTP/1.1, then visitors can only speak to it with HTTP/1.1. But if your server also supports HTTP/2, then your user's browser will choose to speak (the faster) HTTP/2. And if your server does speak HTTP/2 but your user's browser does not, the two will continue to speak HTTP/1.1 just as before, so there's no danger of "breaking" your older users.
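
As a sketch of that negotiation, a Go net/http server enables HTTP/2 automatically over TLS and falls back to HTTP/1.1 for older clients, so a single server really does speak both languages (the certificate paths are placeholders):

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// r.Proto reports which "language" was negotiated:
		// "HTTP/2.0" for modern browsers, "HTTP/1.1" for older ones.
		fmt.Fprintf(w, "You are speaking %s\n", r.Proto)
	})

	// Go's net/http negotiates HTTP/2 automatically over TLS (via ALPN)
	// and falls back to HTTP/1.1 for clients that don't support it.
	// cert.pem and key.pem are placeholder certificate paths.
	http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil)
}
```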

Right now, both the Apache and nginx web servers support HTTP/2: nginx supports it natively, and Apache supports it via the mod_spdy module. Since Apache and nginx serve traffic for 66% of all active websites, chances are good that your website's server can support HTTP/2 right now.

If you aren't using nginx or Apache, you still have other options. There are a number of smaller, more specialized projects that support HTTP/2. You can also place a reverse proxy that supports HTTP/2, like HAProxy, in front of your existing web server to get the same benefit as a web server that directly supports HTTP/2.

If you run your site through a hosting provider, check with them to see which web server version they are running. Major sites like WordPress.com and CloudFlare already offer HTTP/2 support. If your provider is not yet supporting HTTP/2, let them know this is important!

Adding HTTP/2 support

As I mentioned, HTTP/2 is simply another language your web server can use to communicate. Just as a person can learn a new language while remembering their mother tongue, your web server will continue to know how to communicate HTTP/1.1 after you add support for HTTP/2. You aren't in danger of shutting anyone out from speaking with your site. People using newer browsers will communicate using HTTP/2, and older browsers will continue using the older HTTP/1.1—nothing breaks. If you have the time, there really is no reason not to update your site to support HTTP/2.

Remember, HTTP/2 is just a better way to transmit web content than HTTP/1.1. Everything else about your website (the URLs, your HTML markup, your redirects or 404 pages, your page content, etc.) stays the same. This makes adding support for HTTP/2 fairly straightforward:

  1. Make sure your website is using HTTPS. See our previous article on implementing HTTPS without sacrificing performance.
  2. Verify your server software or infrastructure can support HTTP/2.
  3. Update and configure your server software or infrastructure to support HTTP/2.

That's it. Your website is now using HTTP/2.

Well, hopefully it is. The steps involved in updating and configuring your website will vary depending on what software you use, so we cannot provide a detailed guide. However, we did build a free tool, SPDYCheck, which you can use to verify that you have properly configured your website for HTTP/2 (aka SPDY). SPDYCheck works like a checklist, verifying each step of how a browser negotiates with your server to communicate via HTTP/2. It can tell you where in the process things are not working, and it also provides helpful recommendations, like enabling Strict Transport Security. With SPDYCheck, you can be sure that everything is functioning properly and verify that your site supports HTTP/2.
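
As one example of that recommendation, here's a minimal sketch of adding the Strict-Transport-Security header as middleware in a Go server; the one-year max-age shown is a common choice, not a requirement:

```go
package main

import (
	"fmt"
	"net/http"
)

// hsts wraps a handler and adds a Strict-Transport-Security header,
// telling browsers to use HTTPS (and therefore HTTP/2) on all future
// visits to this site.
func hsts(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Strict-Transport-Security",
			"max-age=31536000; includeSubDomains")
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello over", r.Proto)
	})
	// cert.pem and key.pem are placeholder certificate paths.
	http.ListenAndServeTLS(":443", "cert.pem", "key.pem", hsts(mux))
}
```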

Conclusion

We all know that faster sites help improve search engine rankings, but faster sites also offer better user experiences. Faster sites engage your users longer and promote further sharing and linking. HTTP/2 is an amazing leap forward that can help improve the performance and user experience of your website. However, HTTP/2 is not a silver bullet. Optimizations like losslessly optimizing your website's images can have a big effect on your site's performance and will still be needed. In short, while you should add HTTP/2 support to your website, make sure you are also doing other optimizations and following performance best practices to ensure the best possible user experience. If you are looking for a place to start, or want to see how your site is doing, Zoompf's free performance report is a great way to understand what you can do to make your website faster.



By |December 11th, 2014|MOZ

Report: Injection, content duplication leads to lost ad revenue for top pubs

As it turns out, ad fraud knows no strangers.

A comprehensive study by the Association of National Advertisers, focusing heavily on bot traffic, illustrated the extent to which a surprising group of publishers were part of the ad fraud game.

Premium publishers were previously thought to be above the fray when it comes to ad fraud, but the study discovered that a large percentage of it occurs on their sites.

Traffic on premium publisher sites consists of almost 20 percent bots. While much of that can be linked to sourced traffic, often the publishers themselves are the victims of ad-fraud attacks, according to the study, which was released by the ANA in conjunction with WhiteOps, a platform for reducing fraud traffic.

One of the top methods used to attack a website and use its legitimate traffic for fraud was ad injection, in which a browser plug-in "inserts" ads into publishers' sites without permission and without payment to those publishers. To a buyer that has whitelisted a particular domain, it looks like that domain's inventory when, in fact, it isn't. In many cases the ad may be hidden — a pop-up/under, an overlay, etc. — and the performance (or lack thereof) is credited to that domain, though the publisher of that domain has no control over, or benefit from, the impressions.

One case study showed that more than 500,000 ads were injected daily on a top-tier publisher's site. Ads were found to have been injected even on subscription-based sites that promise an ad-free environment for users.

Injected ads also ruin the user experience, cause slower load times, and deliver an overabundance of ads (often highly intrusive to the user). This type of ad injection is just as bad (if not worse) for the user as it is for the publisher, especially considering the infection is often rooted in the user's computer.

“(These) ads were displayed using malware illicitly installed on residential computers,” the report stated. “Victims of malware-driven ad injection inadvertently expose private information. Malware-driven ad injection software on the victim's computer allows potentially malicious, unknown actors to gain access to personally identifiable information (PII), including browsing history, interests, and financial information.”

Publishers receive no money for ads injected on their site, as the payment for those served ads goes instead to the operator of the malware.

One way publishers can protect themselves is through the use of true domain technology, which “identifies the actual domain on which an ad displays, rather than the domain reported by the ad server, which can be falsified,” the report said.

Copying content from premium sites and selling ads around that content is another form of ad fraud that harms publishers. Fraudsters set up bogus sites with stolen content that still manage to attract some amount of human traffic (along with sourced traffic that can also include bots); of the most bot-trafficked sites in the study, 22 percent were found to have duplicated content. A fraudster can sometimes rip off a legitimate publisher's content in bulk in order to draw search traffic away from the original publisher. Ads are served on that site, to bots and humans alike, and the ad revenue goes to the thief instead of the original publisher of the work.

The first step in protecting yourself from this type of fraud is knowing whether your content is being stolen. Hubspot has a great rundown of ways to find out. Taking it a step further, KissMetrics illustrates ways you can combat content theft.

Download a copy of the full report here.

By |December 10th, 2014|Advertising Technology