
Videos from MozTalk: Blogger Edition

Posted by CharleneKate

Have you ever noticed how Rand is often speaking at conferences all around the world? Well, we realized that those of us here in Seattle rarely get to see those talks. So we started MozTalks, a free event series here at the MozPlex.

It normally runs 2-3 hours with multiple speakers, one of whom is Rand. The event is hosted at the Moz HQ and offers time for mingling, appetizers, refreshments and of course, swag. The series is still evolving as we continue to test out new ideas (maybe taking the show on the road), so be on the lookout for any updates.

Our most recent MozTalk back in September was a smashing success. Rand and his wife Geraldine, widely known as The Everywhereist, were our featured speakers, and the event focused on blogging, driving traffic to your site, and finding your online personality.

Wouldn't it be amazing if driving traffic and building your blog's brand were easy? You launch your blog, you publish awesome content, your metrics go through the roof and everyone just absolutely loves you. Bada bing, bada boom! We all know, however, that the Web is a crazy beast, and the number of individuals constantly sharing their thoughts, stories, expertise, and experiences can be overwhelming. Not to mention that search engine optimization itself has become considerably more advanced and challenging over the years.

So how do you stand out from the millions of personalities, blogs, tweets, and other search results? In the presentations below, Rand and Geraldine dive in and offer tips and tricks on how to drive traffic to your site and get your readers to fall in love with you.

Rand: What Bloggers Need to Know About SEO in 2014


Geraldine: How to Make Your Audience Fall in Love with Your Blog

Top three takeaways

As someone who's in the midst of launching my own personal blog, these particular takeaways really resonated with me:

Give SEO some love. For many marketers, SEO is a no-brainer, but oftentimes it's harder to convince bloggers that SEO is important. The fact is, search continues to grow massively. There are more than 6 billion searches performed every day, and guess what? 80% of clicks go to organic results. That's where SEO comes in. Here's a little secret: You actually don't have to be an SEO expert to be effective; you just have to be kinda good at it.

Be authentic. Connect with your audience through familiarity and by being genuine. This will not only help you grow a loyal and dedicated following, but create a bond between you and the readers. There's no better voice for you than your own.

Be patient. If something's not working, don't panic. Traffic doesn't happen overnight but if you stick to your guns and stay true to your efforts then the results will be rewarding. Also, don't be afraid to switch it up and try new things if you have to.

Join us for the next one

We've got our next MozTalk scheduled for Tuesday, November 18th with Rand and Dr. Pete Meyers, who joins us all the way from Chicago! We'll be sure to let folks know once we have the videos of the talks on the blog, but for now, we hope to see you there!

Join the next free MozTalk



November 13th, 2014

The Danger of Crossing Algorithms: Uncovering The Cloaked Panda Update During Penguin 3.0

Posted by glenngabe

Penguin 3.0 was one of the most anticipated algorithm updates in recent years when it rolled out on October 17, 2014. Penguin hadn't run for over a year at that point, and there were many webmasters sitting in Penguin limbo waiting for recovery. They had cleaned up their link profiles, disavowed what they could, and were simply waiting for the next update or refresh. Unfortunately, Google was wrestling with the algo internally and over twelve months passed without an update.

So when Pierre Far finally announced Penguin 3.0 a few days later on October 21, a few things stood out. First, this was not a new algorithm like Gary Illyes had explained it would be at SMX East. It was a refresh and underscored the potential problems Google was battling with Penguin (cough, negative SEO).

Second, we were not seeing the impact that we expected. The rollout seemed to begin with a heavier international focus, and the overall U.S. impact has been underwhelming to say the least. There were definitely many fresh hits globally, but there were a number of websites that should have recovered but didn't for some reason. And many are still waiting for recovery today.

Third, the rollout would be slow and steady and could take weeks to fully complete. That's unusual, but makes sense given the microscope Penguin 3.0 was under. And this third point (the extended rollout) is even more important than most people think. Many webmasters are already confused when they get hit during an acute algorithm update (for example, when an algo update rolls out on one day). But the confusion gets exponentially worse when there is an extended rollout.

The more time that goes by between the initial launch and the impact a website experiences, the more questions pop up. Was it Penguin 3.0 or was it something else? Since I work heavily with algorithm updates, I've heard similar questions many times over the past several years. And the extended Penguin 3.0 rollout is a great example of why confusion can set in. That's my focus today.

Penguin, Pirate, and the anomaly on October 24

With the Penguin 3.0 rollout, we also had Pirate 2 rolling out. And yes, there are some websites that could be impacted by both. That added a layer of complexity to the situation, but nothing like what was about to hit. You see, I picked up a very strange anomaly on October 24. And I clearly saw serious movement on that day (starting late in the day ET).

So, if there was a third algorithm update, then that's three potential algo updates rolling out at the same time. More about this soon, but it underscores the confusion that can set in when we see extended rollouts, with a mix of confirmed and unconfirmed updates.

Penguin 3.0 tremors and analysis

Since I do a lot of Penguin work, and have researched many domains impacted by Penguin in the past, I heavily studied the Penguin 3.0 rollout and published a blog post based on analyzing the first ten days of Penguin 3.0 which included some interesting findings for sure.

And based on the extended rollout, I definitely saw Penguin tremors beyond the initial October 17 launch. For example, check out the screenshot below of a website seeing Penguin impact on October 17, 22, and 25.

But as mentioned earlier, something else happened on October 24 that set off sirens in my office. I started to see serious movement on sites impacted by Panda, and not Penguin. And when I say serious movement, I'm referring to major traffic gains or losses all starting on October 24. Again, these were sites heavily dealing with Panda and had clean link profiles. Check out the trending below from October 24 for several sites that saw impact.

A good day for a Panda victim:


A bad day for a Panda victim:


And an incredibly frustrating day for a 9/5 recovery that went south on 10/24:

I saw this enough that I tweeted heavily about it and included a section about Panda in my Penguin 3.0 blog post. And that's when something wonderful happened, and it highlights the true beauty and power of the internet.

As more people saw my tweets and read my post, I started receiving messages from other webmasters explaining that they saw the same exact thing, and on their websites dealing with Panda and not Penguin. And not only did they tell me about it, they showed me the impact.

I received emails containing screenshots and tweets with photos from Google Analytics and Google Webmaster Tools. It was amazing to see, and it confirmed that we had just experienced a Panda update in the middle of a multi-week Penguin rollout. Yes, read that line again. Panda during Penguin, right when the internet world was clearly focused on Penguin 3.0.

That was a sneaky move Google… very sneaky. :)

So, based on what I explained earlier about webmaster confusion and algorithms, can you tell what happened next? Yes, massive confusion ensued. We had the trifecta of algorithm updates with Penguin, Pirate, and now Panda.

Webmaster confusion and a reminder of the algo sandwich from 2012

So, we had a major algorithm update during two other major algorithm updates (Penguin and Pirate) and webmaster confusion was hitting extremely high levels. And I don't blame anyone for being confused. I'm neck deep in this stuff and it confused me at first.

Was the October 24 update a Penguin tremor or was this something else? Could it be Pirate? And if it was indeed Panda, it would have been great if Google told us it was Panda! Or did they want to throw off SEOs analyzing Penguin and Pirate? Does anyone have a padded room I can crawl into?

Once I realized this was Panda, and started to communicate the update via Twitter and my blog, I had a number of people ask me a very important question:

"Glenn, would Google really roll out two or three algorithm updates so close together, or at the same time?"

Why yes, they would. Anyone remember the algorithm sandwich from April of 2012? That's when Google rolled out Panda on April 19, then Penguin 1.0 on April 24, followed by Panda on April 27. Yes, we had three algorithm updates all within ten days. And let's not forget that the Penguin update on April 24, 2012 was the first of its kind! So yes, Google can, and will, roll out multiple major algos around the same time.

Where are we headed? It's fascinating, but not pretty

Panda is near real-time now

When Panda 4.1 rolled out on September 23, 2014, I immediately disliked the title and version number of the update. Danny Sullivan named it 4.1, so it stuck. But for me, that was not 4.1… not even close. It was more like 4.75. You see, there have been a number of Panda tremors and updates since P4.0 on May 20, 2014.

I saw what I was calling "tremors" nearly weekly based on having access to a large amount of Panda data (across sites, categories, and countries). And based on what I was seeing, I reached out to John Mueller at Google to clarify the tremors. John's response was great and confirmed what I was seeing. He explained that there was not a set frequency for algorithms like Panda. Google can roll out an algorithm, analyze the SERPs, refine the algo to get the desired results, and keep pushing it out. And that's exactly what I was seeing (again, almost weekly since Panda 4.0).

When Panda and Penguin meet in real time…

…they will have a cup of coffee and laugh at us. :) So, since Panda is near-real time, the crossing of major algorithm updates is going to happen. And we just experienced an important one on October 24 with Penguin, Pirate, and Panda. But it could (and probably will) get more chaotic than what we have now. We are quickly approaching a time where major algorithm updates crafted in a lab will be unleashed on the web in near-real time or in actual real time.

And if organic search traffic from Google is important to you, then pay attention. We're about to take a quick trip into the future of Google and SEO. And after hearing what I have to say, you might just want the past back…

Google's brilliant object-oriented approach to fighting webspam

I have presented at the past two SES conferences about Panda, Penguin, and other miscellaneous disturbances in the force. More about those "other disturbances" soon. In my presentation, one of my slides looks like this:

Over the past several years, Google has been using a brilliant, object-oriented approach to fighting webspam and low quality content. Webspam engineers can craft external algorithms in a lab and then inject them into the real-time algorithm whenever they want. It's brilliant because it isolates specific problems, while also being extremely scalable. And by the way, it should scare the heck out of anyone breaking the rules.

For example, we have Panda, Penguin, Pirate, and Above the Fold. Each was crafted to target a specific problem and can be unleashed on the web whenever Google wants. Sure, there are undoubtedly connections between them (either directly or indirectly), but each specific algo is its own black box. Again, it's object-oriented.

Now, Panda is a great example of an algorithm that has matured to where Google highly trusts it. That's why Google announced in June of 2013 that Panda would roll out monthly, over ten days. And that's also why it matured even more with Panda 4.0 (and why I've seen tremors almost weekly.)

And then we had Gary Illyes explain that Penguin was moving along the same path. At SMX East, Gary explained that the new Penguin algorithm (which clearly didn't roll out on October 17) would be structured in a way where subsequent updates could be rolled out more easily. You know, like Panda.

And by the way, what if this happens to Pirate, Above the Fold, and other algorithms that Google is crafting in its Frankenstein lab? Well my friends, then we'll have absolute chaos and society as we know it will crumble. OK, that's a bit dramatic, but you get my point.

We already have massive confusion now… and a glimpse into the future reveals a continual flow of major algorithms running in real time, each of which could pummel a site to the ground. And of course, with little or no sign of which algo actually caused the destruction. I don't know about you, but I just broke out in hives. :)

Actual example of what (near) real-time updates can do

After Panda 4.0, I saw some very strange Panda movement for sites impacted by recent updates. And it underscores the power of near-real time algo updates. As a quick example, temporary Panda recoveries can happen if you don't get out of the gray area enough. And now that we are seeing Panda tremors almost weekly, you can experience potential turbulence several times per month.

Here is a screenshot from a site that recovered from Panda, didn't get out of the gray area and reentered the strike zone, just five days later.

Holy cow, that was fast. I hope they didn't plan any expensive trips in the near future. This is exactly what can happen when major algorithms roam the web in real time. One week you're looking good and the next week you're in the dumps. Now, at least I knew this was Panda. The webmaster could tackle more content problems and get out of the gray area… But the ups and downs of a Panda roller coaster ride can drive a webmaster insane. It's one of the reasons I recommend making significant changes when you've been hit by Panda. Get as far out of the gray area as possible.

An "automatic action viewer" in Google Webmaster Tools could help (and it's actually being discussed internally by Google)

Based on webmaster confusion, many have asked Google to create an "automatic action viewer" in Google Webmaster Tools. It would be similar to the "manual actions viewer," but focused on algorithms that are demoting websites in the search results (versus penalties). Yes, there is a difference by the way.

The new viewer would help webmasters better understand the types of problems that are being impacted by algorithms like Panda, Penguin, Pirate, Above the Fold, and others. Needless to say, this would be incredibly helpful to webmasters, business owners, and SEOs.

So, will we see that viewer any time soon? Google's John Mueller addressed this question during the November 3 webmaster hangout (at 34:54).

John explained they are trying to figure something out, but it's not easy. There are so many algorithms running that they don't want to provide feedback that is vague or misleading. But, John did say they are discussing the automatic action viewer internally. So you never know…

A quick note about Matt Cutts

As many of you know, Matt Cutts took an extended leave this past summer (through the end of October). Well, he announced on Halloween that he is extending his leave into 2015. I won't go crazy here talking about his decision overall, but I will focus on how this impacts webmasters as it relates to algorithm updates and webspam.

Matt does a lot more than just announce major algo updates… He actually gets involved when collateral damage rears its ugly head. And there's not a faster way to rectify a flawed algo update than to have Mr. Cutts involved. So before you dismiss Matt's extended leave as uneventful, take a look at the trending below:

Notice the temporary drop off a cliff, then 14 days of hell, only to see that traffic return? That's because Matt got involved. That's the movie blog fiasco from early 2014 that I heavily analyzed. If Matt hadn't been notified of the drop via Twitter and hadn't taken action, I'm not sure the movie blogs that got hit would be around today. I told Peter from SlashFilm that his fellow movie blog owners should all pay him a bonus this year. He's the one that pinged Matt via Twitter and got the ball rolling.

It's just one example of how having someone with power out front can nip potential problems in the bud. Sure, the sites experienced two weeks of utter horror, but traffic returned once Google rectified the problem. Now that Matt isn't actively helping or engaged, who will step up and be that guy? Will it be John Mueller, Pierre Far, or someone else? John and Pierre are greatly helpful, but will they go to bat for a niche that just got destroyed? Will they push changes through so sites can turn around? And even at its most basic level, will they even be aware the problem exists?

These are all great questions, and I don't want to bog down this post (it's already incredibly long). But don't laugh off Matt Cutts taking an extended leave. If he's gone for good, you might only realize how important he was to the SEO community once he's no longer there. And hopefully that realization doesn't come because your site just tanked as collateral damage during an algorithm update while Matt was off running a marathon or trying on new Halloween costumes. Then where will you be?

Recommendations moving forward:

So where does this leave us? How can you prepare for the approaching storm of crossing algorithms? Below, I have provided several key bullets that I think every webmaster should consider. I recommend taking a hard look at your site now, before major algos are running in near-real time.

  • Truly understand the weaknesses with your website. Google will continue crafting external algos that can be injected into the real-time algorithm. And they will go real-time at some point. Be ready by cleaning up your site now.
  • Document all changes and fluctuations the best you can. Use annotations in Google Analytics and keep a spreadsheet updated with detailed information.
  • Along the same lines, download your Google Webmaster Tools data monthly (at least). After helping many companies with algorithm hits, that information is incredibly valuable, and can help lead you down the right recovery path.
  • Use a mix of audits and focus groups to truly understand the quality of your site. I mentioned in my post about aggressive advertising and Panda that human focus groups are worth their weight in gold (for surfacing Panda-related problems). Most business owners are too close to their own content and websites to accurately measure quality. Bias can be a nasty problem and can quickly lead to bamboo-overflow on a website.
  • Beyond on-site analysis, make sure you tackle your link profile as well. I recommend heavily analyzing your inbound links and weeding out unnatural links. And use the disavow tool for links you can't remove. The combination of enhancing the quality of your content, boosting engagement, knocking down usability obstacles, and cleaning up your link profile can help you achieve long-term SEO success. Don't tackle one quarter of your SEO problems. Address all of them.
  • Remove barriers that inhibit change and action. You need to move fast. You need to be decisive. And you need to remove red tape that can bog down the cycle of getting changes implemented. Don't water down your efforts because there are too many chefs in the kitchen. Understand the changes that need to be implemented, and take action. That's how you win SEO-wise.

Summary: Are you ready for the approaching storm?

SEO is continually moving and evolving, and it's important that webmasters adapt quickly. Over the past few years, Google's brilliant object-oriented approach to fighting webspam and low-quality content has yielded algorithms like Panda, Penguin, Pirate, and Above the Fold. And more are on their way. My advice is to get your situation in order now, before crossing algorithms blend a recipe of confusion that makes it exponentially harder to identify, and then fix, the problems riddling your website.

Now excuse me while I try to build a flux capacitor. :)



November 12th, 2014

Developing Innovative Content: What You Need to Know

Posted by richardbaxterseo

A few weeks ago, I attended a breakfast meeting with a bunch of entrepreneurs in the technology, space (yes, space travel), software and engineering industries. I was blown away by the incredible talent of the speakers. You know, there are people out there building things like private satellite networks, bio-printing facilities, quantum computers and self-driving cars. I was completely transfixed by the incredibly future-facing, innovative and exceptionally inventive group in front of me. I also immediately wished I'd worked a little harder in my twenties.

After the presentations, one of the questions that came up during the Q&A session was: "what's the next big thing?"

Wow. Have you ever thought about "the next big thing"?

Part of the magic of predicting innovation is that it's really, really hard to get right. Those that can accurately predict the future (in my humble opinion) are those that tend to understand how people will respond to an idea once they're exposed to it. I think predicting this is a very special skill indeed.

Then again, we're expected to be able to predict the outcome of our marketing, all the time. While predicting it is one thing, making it happen is a whole different ball game.

Competition for the attention of our customers is getting tougher

In our industry, when you really boil down what it is we do, we're fixing things, making things, or we're communicating things.

Most of the time, we're building content that communicates: ideas, stories, news and guidance–you get the idea. The problem is, no matter which vertical you work in, we're all competing for something: the attention of our customers.

As our customers get smarter, that competition is getting tougher and tougher.

The most successful marketers in our industry all have a special trait in common. They are good at finding new ways to communicate ideas. Take a look at classic presentations like this from Ross Hudgens to see just how powerful it can be to observe, imitate and develop an idea with astounding viral reach.

I particularly enjoy the idea of taking a piece of content and making improvements, be it through design, layout or simply updating what's there. I like it because it's actually pretty easy to do, and there's growing evidence of it happening all over the Internet. Brands are taking a second look at how they're developing their content to appeal to a wider audience, or to appeal to a viral audience (or both!).

For example, take a look at this beautiful travel guide to Vietnam (credit: travelindochina.com) or this long form guide to commercial property insurance (credit: Towergate Insurance / Builtvisible.com) for examples of brands in competitive verticals developing their existing content. In verticals where ordinary article content has been done to death, redeveloping the medium itself feels like an important next step.

Innovative isn't the same thing as technical

I've felt for a long time that there's a conflict between our interpretation of "innovative" and "technical". As I've written before, those that really understand how the web works are at a huge advantage. Learn how it's built, and you'll find yourself able to make great things happen on your own, simply by learning and experimenting.

In my opinion though, you don't have to be able to learn how to build your own site or be a developer. All you have to do is learn the vocabulary and build a broad understanding of how things work in a browser. I actually think we all need to be doing this, right now. Why?

We need more innovation in content marketing

I think our future depends on our industry's ability to innovate. Of course, you still need to have your basics in place. We'll always be T-Shaped marketers, executing a bit of technical SEO here, a bit of content strategy there. But, we're all SEOs and we know we need to acquire links, build audiences and generally think big about our ambitions. When your goal is to attract new followers, fans, links, and garner shares in their thousands, you need to do something pretty exciting to attract attention to yourself.

The vocabulary of content development

I've designed this post to be a primer on more advanced features found in innovative content development. My original MozCon 2014 presentation was designed to educate on some of the technologies we should be aware of in our content development projects and the process we follow to build things. We'll save process for another post (shout in the comments if you think that would be useful!) and focus on the "what" for now.

At Builtvisible, we're working hard on extending our in-house content development capabilities. We learn through sharing amazing examples with each other. Our policy is to always attempt to deconstruct how something might have been developed, that way, we're learning. Some of the things we see on the web are amazing–they deserve so much respect for the talent and the skills that surface the content.

Here are some examples that I think demonstrate some of the most useful types of approach for content marketers. I hope that these help as much as they've helped us, and I hope you can form a perspective of what innovative features look like in more advanced content development. Of course, do feel welcome to share your own examples in the comments, too! The more, the merrier!

The story of EBoy

eBoy: the graphic design firm whose three co-founders and sole members are widely regarded as the "godfathers" of pixel art.

The consistent styling (as well as the beautifully written content) is excellent. Technically speaking, perhaps the most clever and elegant feature is the zoom of the image positioned on the Z axis in a container (more on this in a moment).

An event listener (jQuery) helps size the canvas appropriately to the browser window, and the Z-axis position shifts on scroll to create an elegant zoom effect.
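
As a rough illustration of that pattern (a sketch only: the element ID and scroll factor here are made up, and jQuery is assumed to be loaded), the sizing and scroll handlers might look something like this:

    var $window = $(window),
        canvas  = document.getElementById('artwork');

    // Keep the canvas sized to the browser window
    $window.on('resize', function () {
      canvas.width  = $window.width();
      canvas.height = $window.height();
    }).trigger('resize');

    // Scale the artwork up as the page scrolls to create the zoom effect
    $window.on('scroll', function () {
      var zoom = 1 + $window.scrollTop() / 1000;
      $(canvas).css('transform', 'scale(' + zoom + ')');
    });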

View the example here: http://www.theverge.com/2014/6/17/5803850/pixel-perfect-the-story-of-eboy.

<canvas> is an HTML element which can be used to draw graphics using scripting (usually JavaScript). It can, for instance, be used to draw graphs, compose photos, or create simple animations.
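
If you haven't used it before, a minimal example of drawing onto a canvas with JavaScript (assuming a <canvas> element is on the page) looks like this:

    var canvas = document.querySelector('canvas'),
        ctx    = canvas.getContext('2d');      // 2D drawing context

    ctx.fillStyle = '#c0392b';
    ctx.fillRect(20, 20, 150, 100);            // draw a simple rectangle

    ctx.font = '16px sans-serif';
    ctx.fillStyle = '#fff';
    ctx.fillText('Hello, canvas', 30, 50);     // and some text on top of it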

Colorizing the past

Take a look at Pixart Printing's Guide to Colourizing the Past (credit: Pixartprinting / Builtvisible.com) for a clever example of <canvas> in use. Here's one of the images (tip: mouse over and click the image):

The colorization feature takes advantage of the power of the canvas element. In this case, the color version of the image is applied to the canvas as a background image, with the black and white version on a layer above. Clicking (or touching, on mobile) erases portions of the top image, revealing the color version underneath.
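A minimal sketch of that reveal technique (assuming the colour image is set as a CSS background on the canvas, the black-and-white version has already been drawn onto it, and the element ID is a placeholder):

    var canvas = document.getElementById('colourise'),
        ctx    = canvas.getContext('2d');

    // Erase pixels from the black-and-white layer wherever the user clicks,
    // letting the colour background image show through
    canvas.addEventListener('click', function (e) {
      var rect = canvas.getBoundingClientRect();
      ctx.globalCompositeOperation = 'destination-out';
      ctx.beginPath();
      ctx.arc(e.clientX - rect.left, e.clientY - rect.top, 30, 0, Math.PI * 2);
      ctx.fill();
    });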

Chrome Experiments: Globe

Globe is "simple" global data visualization of the Earth's population growth over a set range of dates. The 3d visualization based in webGL: a JavaScript API for rendering interactive 3D graphics and 2D graphics within any compatible web browser without the use of plug-ins.

View the example here: http://globe.chromeexperiments.com/.

WebGL is a really exciting, emerging option available to content marketers who might want to experiment with immersive experiences or highly interactive, simulated environments.
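To give you a feel for how little code a basic WebGL scene takes, here's a tiny sketch using the three.js library (which wraps the raw WebGL API); the scene contents are just a placeholder spinning cube:

    var scene    = new THREE.Scene(),
        camera   = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000),
        renderer = new THREE.WebGLRenderer();

    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    var cube = new THREE.Mesh(
      new THREE.BoxGeometry(1, 1, 1),
      new THREE.MeshBasicMaterial({ color: 0x00ff00, wireframe: true })
    );
    scene.add(cube);
    camera.position.z = 3;

    // Render loop: rotate the cube a little on every animation frame
    (function animate() {
      requestAnimationFrame(animate);
      cube.rotation.x += 0.01;
      cube.rotation.y += 0.01;
      renderer.render(scene, camera);
    })();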

Some of my favourite WebGL examples include Hello Racer and Tweetopia, a 3D Twitter hashtag visualizer.

If you'd like to see more examples of webGL in action, take a look at Chrome Experiments. Don't worry, this stuff works in the latest versions of Firefox and IE, too.

Polygon's PS4 Review

You might have seen me cover this long form concept over at Builtvisible. Polygon's Playstation 4 review is a fully featured "long form" review of Sony's much loved gaming machine. The bit that I love is the SVG visualizations:

"What's SVG?", I hear you ask!

SVG (Scalable Vector Graphics) gives you super-fast, sharp rendering of vector images inside the browser. Unlike raster image files (.jpg, .gif, .png), SVG is XML-based, light on file size, loads quickly and adjusts to responsive browser widths perfectly. SVG's XML-based schema also lends itself to some interesting manipulation for stunning, easy-to-implement effects.

View Polygon's example here: http://www.polygon.com/a/ps4-review.

That line-tracing animation you see is known as path animation. Essentially, the path element in the SVG's XML can be manipulated in the DOM with a little jQuery. What you'll get is a pretty snazzy animation to keep your users' eyes fixated on your content, and yet another nice little effect to keep eyeballs engaged.
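
The usual trick behind that drawing effect is animating the path's stroke-dashoffset. A small sketch, assuming jQuery and an inline SVG with a single path:

    var path   = document.querySelector('svg path'),
        length = path.getTotalLength();

    // Hide the stroke by offsetting a single dash the full length of the path...
    path.style.strokeDasharray  = length;
    path.style.strokeDashoffset = length;

    // ...then tween the offset back to zero so the line appears to draw itself
    $({ offset: length }).animate({ offset: 0 }, {
      duration: 2000,
      step: function (now) {
        path.style.strokeDashoffset = now;
      }
    });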

My favourite example of SVG execution is Lewis Lehe's Gridlocks and Bottlenecks. Gridlocks is an AngularJS and d3.js-based visualization of the surprisingly technical and oft-misunderstood "gridlock" and "bottleneck" events in road traffic management.

It's also very cool:

View the example here: http://setosa.io/blog/2014/09/02/gridlock/.

I have a short vocabulary list that I expect our team to be able to explain (certainly these questions come up in an interview with us!). I think that if you can explain what these things are, as a developing content marketer you're way ahead of the curve:

  • HTML5
  • Responsive CSS (& libraries)
  • CSS3 (& frameworks)
  • JavaScript (& frameworks: jQuery, MooTools, Jade, Handlebars)
  • JSON (API post and response data)
  • webGL
  • HTML5 audio & video
  • SVG
  • HTML5 History API manipulation with pushState
  • Infinite Scroll

Want to learn more?

I've amassed a series of videos on web development that I think marketers should watch. Not necessarily to learn web development, but definitely to be able to describe what it is you'd like your own content to do. My favourite: I really loved Wes Bos's JS + HTML5 Video + Canvas tutorial. Amazing.

Innovation in content is such a huge topic, but I realize I've run out of space for now (this is already a 1,400-word post).

In my follow up, I'd like to talk about how to plan your content when it's a little more extensive than just an article, give you some tips on how to work with (or find!) a developer, and how to make the most of every component in your content to get the most from your marketing efforts.

Until then, I'd love to see your own examples of great content and questions in the comments!



November 11th, 2014

Quick & Easy Guide to Tracking Across Multiple Domains & Subdomains in Google Analytics

Posted by Tom.Capper

Out of the box, Google Analytics handles being deployed across multiple domains or subdomains extremely poorly. This is easily the most common critical problem in Google Analytics, despite its being relatively easy to fix.

Depending on your situation, one or more of a few simple steps may be appropriate. Find the situation below that best describes yours, and make sure you've taken the steps listed beneath it:

Single subdomain
  • Standard Google Analytics

Multiple subdomains or domains, which are treated as separate sites
  • Separate tracking IDs for each site (see "Using separate tracking IDs" below)

Multiple subdomains on a single domain which are treated as a single site
  • Ignore self-referrals
  • Prepend hostname to request URIs
  • Set domain name (ga.js / doubleclick.js only)

Multiple domains with one or more subdomains that are treated as a single site
  • Ignore self-referrals
  • Prepend hostname to request URIs
  • Set domain name (ga.js / doubleclick.js only)
  • Cross-domain linking

As a word of warning, several steps in this document differ according to the tracking code in use, and in these cases I suggest options for each tracking code type. If you're unsure of your current implementation:

  • ga.js / doubleclick.js: Your source code will contain several "_gaq.push" commands
  • analytics.js tracking code: Your source code will contain "ga('create'" and "ga('send'" commands
  • Google Tag Manager: You have an analytics tag in your Google Tag Manager account (which I will assume is set to "Universal Analytics")

If you have updated your Google Analytics interface to Universal Analytics but you're still using the old code, you should follow the recommendations for the old (ga.js / doubleclick.js) tracking code here.

Using separate tracking IDs

Tracking IDs are the unique codes that you're given when you create a Google Analytics property, and look something like "UA-123456-1". Any page with that tracking ID, regardless of the site it's on, will send data to that property.

While it is possible to use the same tracking ID across multiple domains or subdomains and then view them each in isolation using filtered views, the only advantage of doing so is having access to one aggregated view. For the data in this aggregated view to be meaningful, it will need to ignore self-referrals, and this is configured at the property level, meaning that all views will ignore self-referrals, thus leaving the (sub)domain-specific views with a load of "direct" traffic that actually came from sister sites.

This means that you end up choosing between incorrect data in your aggregate view and incorrect data in your specific view. If you do want to be able to have meaningful data in both specific and aggregate views, you could consider having one tracking ID that's used across all sites and additional tracking IDs for each individual site. For details on implementation, check Google's guidelines here (and also here if you use Google Tag Manager).
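
If you go down that route with analytics.js, a minimal sketch of the dual-tracker setup might look like the following (the property IDs and the "siteTracker" name are placeholders):

    // Roll-up property shared across all of your sites
    ga('create', 'UA-XXXXXX-1', 'auto');

    // Site-specific property, created as a named tracker so the two don't collide
    ga('create', 'UA-XXXXXX-2', 'auto', {'name': 'siteTracker'});

    // Send a pageview to both properties
    ga('send', 'pageview');
    ga('siteTracker.send', 'pageview');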

Ignoring self-referrals

A "self-referral" is when one of the sources of traffic to your own site is your own site. They make it very difficult to work out what channels are being effective in driving conversions, because they leave you with missing data for some sessions.

Self referrals don't just screw up your attribution data. They also trigger new sessions, thus ruining your key metrics and making it extremely hard to track the routes individuals take through your site. Fortunately, they're really easy to deal with.

If you have the old ga.js (or doubleclick.js) tracking code, simply add your domains as ignored referrers in your tracking code:
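
As a minimal sketch, that line uses ga.js's _addIgnoredRef method (swap in your own domain and tracking ID):

    _gaq.push(['_setAccount', 'UA-XXXXXX-1']);
    _gaq.push(['_addIgnoredRef', 'example.com']);  // traffic referred from example.com won't be treated as a new referral
    _gaq.push(['_trackPageview']);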

If you need to ignore multiple domains using ga.js or doubleclick.js tracking code, add multiple lines like this one. In either case, make sure that they come between the "setAccount" and "trackPageview" lines.

If you're using analytics.js tracking code, it's even easier:

Navigate to Admin -> Tracking Info -> Referral Exclusion list, and you can add any referrers you want to ignore. Note that although this feature can appear in your Google Analytics user interface even if you're using the old ga.js tracking code, it will only work with analytics.js.

Prepend hostname to request URIs

A "hostname" is the name that Google Analytics gives to the subdomain that a pageview originated from. Request URIs are the names you see in reports when you set a dimension like "landing page", "page" or "previous page path".

Any view that includes data from multiple domains or subdomains runs the risk of aggregating data from multiple pages and considering them the same page. For example, if your site includes "blog.example.com/index.html" and "example.com/index.html", these will be merged in reports under "/index.html", and you'll never have any idea how effective or otherwise your blog and homepage are.

You can overcome this using an advanced filter:
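
The commonly used version of this filter is a custom Advanced filter on the view; its settings look roughly like this (field names as they appear in the Google Analytics admin interface):

    Filter Type:               Custom > Advanced
    Field A -> Extract A:      Hostname      (.*)
    Field B -> Extract B:      Request URI   (.*)
    Output To -> Constructor:  Request URI   $A1$B1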

In the example, this means that we'd see "www.example.com/index.html" as a page in reports, rather than just "/index.html", and metrics that rely on telling the difference between the pages will report their real levels.

Ga.js / doubleclick.js only: Set domain name

For users of the new analytics.js tracking code or a Universal Analytics tag in Google Tag Manager, this step is unnecessary: Unless configured to do otherwise, the cookie is now automatically stored at the highest level possible so as to avoid being subdomain-specific. However, when using the old tracking code, Google Analytics needs a cookie location to be set in the tracking code so that it doesn't lose it when moving between subdomains.

All this means in practice is a simple additional line in your tracking code, between the "_setAccount" and "_trackPageview" lines:
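
That line is the _setDomainName call; a minimal sketch (tracking ID and domain are placeholders):

    _gaq.push(['_setAccount', 'UA-XXXXXX-1']);
    _gaq.push(['_setDomainName', 'example.com']);  // store the cookie at the domain level so it's shared across subdomains
    _gaq.push(['_trackPageview']);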

This should always be set to your domain without any subdomain - e.g. moz.com, distilled.net - not www.moz.com or www.distilled.net.

Cross-domain linking

By default, Google Analytics looks for a cookie on the same domain as the page. If it doesn't find one, it assumes that a new visit has just begun, and starts a new session. When moving between domains, the cookie cannot be transferred, so information about the session must be passed by "decorating" links with tracking information.

Don't panic; this recently got dozens of times easier with the advent of the autoLink plugin for analytics.js. If your site spans multiple domains and you're not already using Google's latest analytics tracking code, this feature should justify the upgrade on its own.

If you can't upgrade for any reason, I won't cover the necessary steps for the old ga.js tracking code in this post, but you can find Google's documentation here.

If you're using on-page analytics.js tracking code, you can implement the autoLink plugin by making some modifications to your tracking code:

  1. Tells analytics.js to check whether the linker parameter exists in the URL and is less than 2 minutes old
  2. Loads the autoLink plugin
  3. The autoLink command is passed domains and two parameters. The first sets whether the linking parameters are in the anchor (rather than the query) portion of the URL, and the second enables form decoration (as well as link decoration).
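
Putting those three steps together, a minimal sketch of the modified analytics.js snippet might look like this (the property ID and domains are placeholders):

    ga('create', 'UA-XXXXXX-1', 'auto', {'allowLinker': true});           // 1. accept incoming linker parameters
    ga('require', 'linker');                                              // 2. load the autoLink plugin
    ga('linker:autoLink', ['example.com', 'example.co.uk'], false, true); // 3. decorate links (and forms) to these domains
    ga('send', 'pageview');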

In Google Tag Manager, it's easier still, and just requires two additional options in your Universal Analytics tag:

In conclusion

Setting up analytics to properly handle multiple domains or subdomains isn't difficult, and not bothering will invalidate your data. If you have any questions or tips, please share them in the comments below.



November 10th, 2014

4 Tips for Producing Great Event Coverage – Whiteboard Friday


Posted by kanejamison

Conferences and trade shows can be sources of wonderful ideas, and covering these events in a way that spreads some of those ideas around is common practice. Not all event coverage is created equal, though, and in today's Whiteboard Friday, Kane Jamison details four areas you should keep in mind as you spread the wealth of knowledge.

For reference, here's a still of this week's whiteboard!

Video transcription

Hey, Moz fans. My name is Kane Jamison. I'm the founder of Content Harmony, and today I want to talk to you about four tips for producing really great event coverage. Specifically, I'm thinking of going to trade shows, conferences, those types of events and doing coverage for your company that's focused on your industry, your clients, or whoever you might be wanting to attract.

1) What type?

The first thing when you get into this that you really need to decide is what type of coverage you are going to focus on. What most people first think of is doing live tweeting or live blogging. Both of those are all right. I have a couple of problems with them. Live tweeting is really short-lived. It's great. You can build some followers, but unless you put it into Storify and then a blog post or something of that nature, it's gone. It's just in your tweet history.

Live blogging has a different problem. It's there. It's easy to access, but it's your notes, and it's not fun to read other people's notes. Unless you are really good at encapsulating what the speaker is saying and putting it into a narrative as you're typing, which most people are not, then it's just going to look like a bunch of bullet points and somebody's notes. I don't really enjoy reading those a lot of the time. I have done them both in the past and come across these problems.

2) Prepare everything!

The next stage that a lot of people will think of is what I would call a value-added recap. This is where, after the event, you go back and write a narrative of what the themes were for the event you attended, where your industry is trending, and you recap some highlights from individual speakers. This works really great. But usually after three days at a conference, I'm really lazy. I want to catch up on sleep that I've missed. I don't want to spend time writing 2,000 words about what happened at a conference that I just attended and putting all of my notes into a blog post. These can work out great, though. I'd refer you to Matt Gratt's 2013 MozCon recap for BuzzStream. That's a favorite of mine from somebody who did a good job of pulling a lot of themes together in an event recap.

What I prefer doing, and what we've done for MozCon at Content Harmony the last two years, is what I'd call live visuals or a visual recap. By live visuals, I mean Twitter images that are coming out on Twitter almost live with what the speaker is saying. For a visual recap, another method we've used is putting quotes and speaker highlights into a SlideShare deck for each day of the event, so that users can look at those slides, paw through them, and see the event highlights in a visual format rather than trying to read a long form blog post. That's my favorite and what I'm really going to focus on today.

There's another fourth format that I have less experience with, but want to highlight, because if you have the manpower to tackle it, it's another great way to produce some visibility for you and your company, and that's just broader event coverage. A great example of this would be going to an event and filming Q&A and interview sessions with event attendees and maybe speakers as well. You might be talking with the speakers about what they are talking about on stage and kind of continuing it off the stage in a more casual format. You could just be asking people about their take on the speakers. Really, you're doing coverage that's less focused on what's being said on stage and more focused on who is there and what they think about everything. That's great, and it's a good way to meet people you want to talk to at the event as well.

As you're getting into something along the lines of live visuals or a visual recap post, you want to do your best to prepare everything that you can in advance. Specifically, you want to prepare everything except for what the speakers will actually say on stage. Anything that can be known in advance, you want to have that done, so that when you get there the first day, you can sit down, start typing notes into whatever your medium is and hit "Publish." You don't have to worry about formatting and all these other little quirks that come along with content assembly and creation.

The first thing, especially if you're going to be doing anything visual, is to have all of your graphics prepared in advance. For our coverage of MozCon 2014, we did live Twitter images. We had all of our Twitter images, everything except for the actual quote from the speaker, prepared the week prior, working with our graphic designer.

If you don't have a graphic designer, that's okay. There are easy ways to get around that without having a lot of design skills. My favorite is to just open up PowerPoint, use a nice-looking color and a big white or black font for your titles. Just type whatever you want into the slide, right-click on it, click "Save as Picture," and you can save that slide as a 4x3 JPEG, which works great for Facebook and Twitter, without having to pull a graphic designer in to help you. That makes it really easy to produce nice-looking visual coverage on the spot, save it, publish it, and you're good.

The next thing you want to do is pre-build your post. We like to host everything on one URL that people can tweet and share and come back to after the event. If you're doing this in WordPress or whatever blog CMS you use, you want to pre-build everything that you can. For MozCon, what we've done is we'll have introductory text about what it is. We'll have our image in the top right. We'll have H2s down the page, marked up with every speaker's name and their session title, and we'll have jump links created like you would see in a table of contents on Wikipedia. Somebody can go to the post, click on the speaker name, and they will go right down to whatever the notes or highlights are for that speaker.

We can build all of this the week in advance. We know what the speakers' names are. We know when they are going to be talking. We usually know the name of their session or what they're presenting on. All of this can be built out before we ever get in to the city or town where the event is actually happening. Getting all that done makes it a lot easier to sit down, start taking notes, and really do what matters, which is recording what the speakers are talking about.

The third thing you want to have handy while you're working is what I'd call a notes clipboard. This is just a quick, one-page text document that has all of the hashtags that you're going to use, all the URLs, like the short Bitly links to the posts that you're writing, and then finally micro-copy, so maybe 40 or 50 character type little bits that you will keep copying or pasting into Twitter or wherever else that you are sharing content. You know you're going to use this throughout the day.

The example for our recent MozCon coverage would be "See more MozCon coverage at" and then the short link to our post. MozCon was already a hashtag, so I know that it's going to be seen in that feed. Everything is all pre-built. All I have to do is around a hundred characters of custom content, add the photo, paste in our little suffix to the tweet, hit "Publish," and I'm good to go. I can move on to the next one. Having all of this prepped makes things a lot easier when you're actually there and live.

3) Buddy system (or automation if you don't have buddies with you)

The third thing you want to think about is how exactly you're going to take notes and record everything across a few days of speakers talking. The best way to do this is to use a buddy system. Have one person taking notes, recording everything that's going on, taking down URLs, quotes, and tools mentioned by speakers, and have an open, shared Google Doc between the two of you, so they can be taking notes in bullets and you can be taking those notes and publishing them either to the blog or to Twitter or wherever you might be doing the event coverage.

The backup option, if you don't have the buddy system, or even if you do and you want more comprehensive notes, is to automate the process. Zapier is a great tool, very similar to If This Then That, which most of you are familiar with from past MozCon content. Zapier allows you to take tweets for a specific hashtag and push them to a Google doc. Every time there is a new tweet, it will push it into a new row of a spreadsheet, and you've got full, live, automated robot notes coming through Twitter. If you miss a link that's shared, if you miss a quote, you can capture that from somebody else. If you do this, I highly recommend thanking the people on Twitter that helped you push through those notes, mentioning them in your posts.

The final thing, regardless of whether you've got a buddy or whether you're automating the process, is to just grab the speaker slides while they're talking. It's kind of cheating, but as long as you don't get ahead of yourself, it's a really easy way to rely on what the speaker's putting out in their slides. Whether they've tweeted a SlideShare link or mentioned a Bitly link on stage or whether the event has actually published a link to it, you can grab these, follow along while they're talking.

If you do this, you have to be careful not to get ahead of yourself and just start copying things from slides. You'll be sitting there. It'll seem easy to do so because it's right there and it's easy for you to get ahead. The problem with this and the danger is that you'll miss the context of what the speaker is actually saying. If you start putting out notes that are based off of the slides and not based off of the speaker and what they're actually saying, that's the fast track for danger and getting called out by somebody for publishing something that the speaker didn't intend, from what their slides may look like they meant to say. So be careful with that. Don't abuse it. But it's a great way to get backup notes while you're trying to take live quotes and coverage.

4) Optimize for the medium

The fourth and final thing that you need to focus on is optimizing for the medium. Specifically, the example I want to use is Twitter images, since that was our most recent focus. In advance, you want to create some kind of personal or fake Twitter account that you can do some testing on. You want to make sure that visuals are sized exactly the way they should be. After a lot of testing with our graphic designer and reading blog posts about ideal Twitter sizing, what we settled on, for MozCon 2014, was 880 pixels wide by 660 pixels tall, a 4x3 ratio, and an image that would scale down nicely on Twitter and look good.

What we did with that is create a header and footer that had information on MozCon, the speaker's session title, and a URL to go find more information. In the center, we had the actual quote from the speaker, the speaker photo, and name. The reason we did this is that, from the cut lines you see, when somebody is scanning through a hashtag or their own feed, they will see a smaller, cut-off version of the image that is more like a two-to-one ratio of width to height. We designed it in a way where they would see a nice slight border on the top and bottom of the image. The only thing they would see in their feed is the speaker name, photo, and the actual quote from the speaker, which is the real substance. We're not forcing the MozCon imagery or our own logo and links on them each time they're looking through their feed. They're only seeing new stuff, even if they're seeing a lot of these images.

If they do click on the tweet or if somebody links them to the actual tweet URL, they'll see the full header and footer. They'll know where it came from. They'll know what the event was, and they'll know where they can find more similar images.

Another nice part about 880x660 for us was that this image size worked well on LinkedIn. It worked well on Facebook. So we could reuse the same image on other mediums as we were going as well.

The other part about other mediums is even if you're focusing on one, like Twitter, you need to optimize your actual posts across a number of mediums. LinkedIn, Facebook, Twitter, Pinterest all have their own graph metadata that goes into a post. You need to make sure, before the event even starts, that all of this is perfectly optimized in your CMS and that when you share this on different social networks, it's going to look great. People are going to want to share this content for you, and you want to break down all the barriers that are in their way to doing so. Make sure that all the descriptions look nice, titles aren't cut off, images are properly sized for each social network, and you'll have a lot better time getting coverage from industry peers and people that want to share that content.

Thanks for your time. I'd love to hear your feedback on what you think could improve live event coverage, along with other tips and ideas, in the comments. Have a good one, Moz fans.

Video transcription by Speechpad.com



November 7th, 2014