Blog

Happy birthday, Photoshop: 25 tributes to the software that remade everything

It all started, as many institutions in our modern world did, with a bunch of Star Wars-loving geeks.

John Knoll, a supervisor at George Lucas' visual effects house, Industrial Light & Magic, was intrigued by an image-editing Mac app that his brother Thomas, then a PhD student, was working on as a hobby, and started showing it around ILM. The computer graphics folks loved that it did most everything their high-end Pixar machines could do, but for a fraction of the price. It was even used in the special effects for James Cameron's The Abyss.

The Knolls realized they had a hot piece of software on their hands, and set about developing it for release. John had originally called it "Display." The brothers eventually renamed it ...

By |February 19th, 2015|Apps and Software|0 Comments

Danger Zones: 4 Things You Need to Know when Testing Emails

Posted by ahpromes

Remember back in January, when we asked you to help us run an experiment with the Marketing Experiments Blog, testing the effectiveness of different email subject lines? The results are in, and we have a subject line winner! We'll talk about the test methodology and the winning submission, but before getting to that, I want to go over some of the common pitfalls and danger zones when it comes to email subject line testing (and, really, testing in general).


Boundary #1: Make sure you're measuring the right thing

Generally speaking, the impact that email subject lines have on the performance of an email campaign is concentrated in open rate: more effective and intriguing subject lines drive more opens. That's because the subject line is the first thing a reader sees in the inbox – and how much of that subject line they will or won't see is heavily influenced by how they've set up their browser and reading panes.

Using my own email accounts as a visual example, you can see that the Gmail inbox can be generous; it shows up to 63 characters of the subject line and body text.

My Outlook web interface cuts subject lines at 52 characters, although this is heavily influenced by my setup – because my reading pane is set to "right" (vs. "bottom" or "off," Outlook's other two choices), I have less screen devoted to email previews and can see fewer subject line characters.

My Yahoo! Mail setup is the least generous, cutting subject lines at 49 characters (but let's be real; it's unlikely that many of your potential customers are still using Yahoo! Mail).

If this is giving you the sneaking suspicion that email subject line length also has an influence on email subject line effectiveness, you're right. In our subject line test, line lengths ranged from 38 characters to 94 characters. The best performing subject line, in terms of driving the highest open rate? Smack in the middle, at 51 characters.

Does this mean 51 characters is the ideal, maximum subject line length? Not necessarily. Too short can be an issue as well: too few characters means fewer words at your disposal to entice an open and convey meaning. The three best performing subject lines in this test (average of 17.5% opened) averaged 51 characters long; the three with the lowest open rates (average of 15.9% opened) averaged 71 characters long. The two control group subject lines (average of 16.4% opened) were our shortest at 38 characters, yet landed squarely in the middle in terms of open rate.
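If you want a quick sanity check before a send, a few lines of code can flag subject lines that will get cut off. Here's a minimal sketch in Python, using the preview limits observed in my own inboxes above; your readers' limits will vary with their clients and settings, so treat these as rough guides, not hard rules:

```python
# Preview limits observed above (Gmail 63, Outlook 52, Yahoo! Mail 49).
# These vary with each reader's client and reading-pane setup, so treat
# them as rough guides rather than hard rules.
PREVIEW_LIMITS = {"Gmail": 63, "Outlook": 52, "Yahoo! Mail": 49}

def preview_report(subject: str) -> None:
    """Show how a subject line would be truncated in each client."""
    print(f"{subject!r} ({len(subject)} characters)")
    for client, limit in PREVIEW_LIMITS.items():
        if len(subject) > limit:
            print(f"  {client}: cut to {subject[:limit]!r}...")
        else:
            print(f"  {client}: shown in full")

# A made-up ~70-character example; anything past each limit is lost.
preview_report("Last chance: 15% off every vacation package when you book by Friday!!")
```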

Boundary #2: If email subject lines only influence open rates, why should I track clicks?

An email subject line can also impact the overall click-to-open rate for an email. This, by the way, is a better measure of performance than click-through rate alone: a high click-to-open rate but a lower click-through rate means that your body copy is strong, but that you have an opportunity to drive even more traffic by modifying your subject line for better open rates, thus increasing the size of the audience exposed to your awesome body copy.
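The arithmetic behind this is worth spelling out: click-through rate is just open rate multiplied by click-to-open rate, so with body copy (and thus click-to-open rate) held constant, a subject line that lifts opens lifts total clicks. A quick sketch with hypothetical numbers, not from this test:

```python
# Hypothetical numbers, not from the test: with body copy (and thus
# click-to-open rate) held constant, a subject line that lifts the open
# rate lifts total clicks, since CTR = open rate * click-to-open rate.
delivered = 10_000
ctor = 0.08  # 8% of openers click -- a property of the body copy

for open_rate in (0.15, 0.20):
    opens = delivered * open_rate
    clicks = opens * ctor
    print(f"open rate {open_rate:.0%}: {clicks:.0f} clicks, "
          f"CTR {clicks / delivered:.2%}")
# open rate 15%: 120 clicks, CTR 1.20%
# open rate 20%: 160 clicks, CTR 1.60%
```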

A subject line sets up an expectation in the mind of the email reader of what is to come; how well the actual content of the email delivers against this expectation leads to either reader satisfaction or disappointment. Strong email subject line-content alignment generally leads to more clicks vs. a subject line that poorly represents the body content of the email.

I can illustrate this with an example of an email test that I ran years ago while working at an online travel company (without all of the specific numbers, which are proprietary), where we tested different subject lines offering varying percent discounts on the purchase of our products. Our test went something like this, but with a dozen or so different test cells sent to millions of customers:

  • Subject Line 1: Get 15% off vacation packages!
  • Body of Email 1: Blah, blah, blah, Get 15% off vacation packages!
  • Subject Line 2: Open to discover your vacation package discount!
  • Body of Email 2: Blah, blah, blah, Get 15% off vacation packages!
  • Subject Line 3: [etc.]

What we learned was that we had better click-to-open rates on the emails where we had strong subject-body agreement, like in example 1; where we had vague subject lines we could drive a lot of interest (read: opens), but our body content seemed to disappoint in that our click-to-open rates were lower than in our matchy-matchy test cells.

For this VolunteerMatch email test, the body copy of all emails was identical except for one sentence; that one sentence had four different variations that were written to map to the six test (and one control) subject lines.

Our highest click-to-open rate (6.3%) in this email test, "Volunteering matters: We have the proof.", was also the subject line that delivered the highest click-through rate (1.08%), even though it placed only second in terms of overall opens (17.3%). This indicates that the body copy of the email delivered on the promise of the subject line pretty well, and that the area of opportunity here would be to work on increasing overall opens (i.e., more potential people to click).

Our highest open rate subject line (18.2%), "The volunteer app your coworkers will talk about", did not win in terms of either overall clicks (0.98%) or click-to-open rate (5.4%). This tells me two things:

  • The email body copy did not do a strong job of delivering on the expectations set by the subject line, and
  • The more I can refine that body copy to closely match the expectations set by the subject line, the more likely I am to drive total clicks.

Boundary #3: Are you measuring or categorizing tangible things?

I call this the "specious lens" test. When you're looking at test results, be wary about what you use to classify or categorize your results. The subject line character length category is a tangible thing, perceivable by both testers and email recipients. Let's look at some other subject line classifications for this email test to see if anything else has a real impact on open rates:

  • Use of special characters (e.g., punctuation marks)
  • Use of title case vs. sentence case

Both use of special characters and use of case are tangible to customers. But from the chart above, you can see that there really isn't any correlation between either of these classifications and higher (or lower) open rates. The best performing subject line and one of the test's bottom three both excluded any kind of punctuation. Same for case: both the highest and worst performing subject lines used sentence case. Neither of these classifications appears to have any real, measurable impact, in this example, on customer email open rates.

If you are applying value categorizations to your test results, however, you need to be especially wary when trying to draw conclusions; this is because the value categories that you create are less likely to be tangibly perceptible by your customers. If I group the tested subject lines by the value or sentiment that they primarily convey, I create the following four buckets:

  • Focuses on Caring as a sentiment
  • Focuses on Mobile App
  • Focuses on Quantifiable results
  • Focuses on Values (Good/Bad)

If you are classifying your test results based on your or your team's value judgments, as I did here, and you can't see any performance difference between your classifications, as is true here, ask yourself, "Are these classifications tangible to the customer? Do they fail to have a real impact on outcomes, or are they simply not real?"

In this case, my answer is, "It's not that value or sentiment don't have an impact on outcomes, it's that these sentiment classifications are likely not perceptible to the customer and thus aren't a valid way in which to categorize this test." It's also risky to classify results after you already know the test outcomes; this can lead to you fitting a hypothesis to the test results vs. letting your test results prove or disprove your hypothesis.

Boundary #4: Statistics is your friend (i.e. math is important)

The last boundary to be aware of is statistics. Run all of your results data through some kind of statistical tool to make sure that the variations you're seeing between your test segments are more than just random background noise. There are a lot of factors that go into determining statistical significance, such as overall sample sizes, overall "action" rates, the differences between action rates, and how confident you'd like to be in your results (e.g., it's often easier to measure the difference between 1.1% and 0.1% than it is to measure the difference between 101% and 100%).

For this test, I've mentioned several times that two control emails were used. These both went to approximately the same number of people (36,000), and had identical subject lines and identical body copy. These two segments had similar, but not identical, overall open rates of 16.4% and 16.5%. In order to make sure that overall results are valid and there is no unintentional selection skew when creating (what should be random) segments, it's imperative to make sure that the variation between these two control segments is nothing other than random noise.

In the chart below, you can see that these slight variations in open rate between the two test cells are not statistically significant; a very important signal that the total data set from the test is valid, too.

If you don't have your own stats or analytics resources to help you with this last step, there are a lot of great tools and worksheets online to get you started, including the one that I've used here, from http://visualwebsiteoptimizer.com/.
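If you'd rather roll your own, the standard approach for comparing two open rates is a two-proportion z-test. Here's a minimal sketch; the counts are back-calculated from the ~36,000-recipient control cells and their 16.4% and 16.5% open rates, so they're approximate:

```python
import math
from scipy.stats import norm

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test: are two open rates really different?"""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    # Pool the two samples under the null hypothesis of no difference.
    p_pool = (opens_a + opens_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return z, 2 * norm.sf(abs(z))  # z-statistic and two-sided p-value

# Approximate counts for the two control cells: 16.4% and 16.5% open
# rates on ~36,000 delivered each.
z, p = two_proportion_z_test(5904, 36000, 5940, 36000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p ~ 0.72, far above 0.05: just noise
```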

And now to the contest results!

The methodology

First things first, let's go over what was actually tested:

  • 6 subject line "test cells" that each received a different email subject line
  • 2 subject line "control cells" that received the same email subject line
  • Just under 36,000 emails delivered to each test and control cell
  • 287,117 emails delivered, overall
  • Email body copy differed by one sentence in each test cell; otherwise was identical

Metrics recorded included:

  • Emails delivered
  • Email opens
  • Email clicks

These three metrics were then used to calculate:

  • Open rate (opens / delivered)
  • Click-through rate (clicks / delivered)
  • Click-to-open rate (clicks / opens)
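Tying those definitions together, here's a tiny helper; the counts are hypothetical, chosen to roughly reproduce the winning cell's reported rates:

```python
# Compute the three derived rates exactly as defined above. The counts
# are hypothetical, picked to roughly match the winning cell's reported
# 17.3% open rate, 1.08% click-through rate, and 6.3% click-to-open rate.
def email_rates(delivered: int, opens: int, clicks: int) -> dict:
    return {
        "open_rate": opens / delivered,
        "click_through_rate": clicks / delivered,
        "click_to_open_rate": clicks / opens,
    }

print(email_rates(delivered=36_000, opens=6_228, clicks=389))
# -> open rate ~0.173, click-through rate ~0.0108, click-to-open ~0.0625
```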

The actual subject lines that were used in the test, along with all of the corresponding metrics:

"Spread the Only 'Good' Office Virus" was used as the subject line for the two control cells. (Why use two control cells? The Marketing Experiments Blog wrote up its takeaways from the experiment a few weeks ago, and you can read the details and rationale there.)

The winning, reader-submitted subject line (that drove the highest rate of clicks) was submitted by Moz Blog reader Jeff Purdon, an In-House Web Marketing Specialist for a manufacturing company. Jeff wins a ticket to the MarketingSherpa Email Summit 2015 and a stay at the ARIA Resort in Las Vegas. Congratulations, Jeff!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

By |February 19th, 2015|MOZ|0 Comments

UK banks RBS and NatWest are now letting customers use Touch ID

LONDON — For the first time in the UK, customers at two banks will be able to access their accounts using a fingerprint.

Royal Bank of Scotland introduced the service on Wednesday for RBS and NatWest customers who use an iPhone 5s, 6 or 6 Plus, all of which have Apple's Touch ID technology.

Nearly half of RBS' 15 million customers use online banking, with three million of them accessing the services via the company's mobile apps.

The 880,000 RBS customers with Touch ID-enabled iPhones can opt to log into their accounts almost instantly with their fingerprints instead of typing in passwords ...

By |February 18th, 2015|Apps and Software|0 Comments

Six takeaways from that epic Jony Ive profile

Ian Parker's exhaustive 17,000-word profile of Jony Ive in The New Yorker is must-read material.

If you haven't already read the profile — unprecedented in its access to Ive and the Apple design studio — do it now. It's the best look we've had at the design processes within Apple, as well as at the man at the center of it all.

There is a lot in the profile to unpack. As John Gruber put it, "this is a resource we'll refer to for decades to come."

Here are just a few highlights from this truly extraordinary piece ...

By |February 18th, 2015|Apps and Software|0 Comments

Hiring for SEO: How to Find and Hire Someone with Little or No Experience

Posted by RuthBurrReedy

SEO is a seller's market. The supply of people with SEO experience is currently no match for the demand for search engine marketing services, as anyone who has spent months searching for the right SEO candidate can tell you. Even in a big city with a booming tech scene (like Seattle, LA, New York, or Austin), experienced SEOs are thin on the ground. In a local market where the economy is less tech-driven (like, say, Oklahoma City, where I work), finding an experienced SEO (even one with just a year or two of experience) is like finding a unicorn.

You're hired. (Photo via Pixabay)

If you're looking for an in-house SEO or someone to run your whole program, you may have no choice but to hold out for a hero (and think about relocating someone). If you're an SEO trying to grow a team of digital marketers at an agency or to expand a large in-house team, sometimes your best bet is to hire someone with no digital marketing experience but a lot of potential and train them.

However, you can't plug just anyone into an SEO role, train them up right and have them be fantastic (or enjoy their job); there are definite skills, talents and personality traits that contribute to success in digital marketing.

Most advice on hiring SEOs is geared toward making sure they know their stuff and aren't spammers. That's not really applicable to hiring at the trainee level, though. So how can you tell whether someone is right for a job they've never done? At BigWing, we've had a lot of success hiring smart young people and turning them into digital marketers, and there are a few things we look for in a candidate.

Are they an aggressive, independent learner?

Successful SEOs spend a ton of time on continued learning—reading blogs, attending conferences and webinars, discussing and testing new techniques—and a lot of that learning happens outside of normal work hours. The right candidate should be someone who loves learning and has the ability to independently drive their ongoing education.

Ask job candidates about a situation where they've had to quickly pick up a new skill. What did they do to learn it? How did that go? If it's never come up for them, ask what they might do in that situation.

Interview prep is something I always look for in a candidate, since it shows they're actually interested in the job. Ask what they've done to prep for the interview. Did they take a look at your company website? Maybe do some Googling to find other informational resources on what digital marketing entails? What did they learn? Where did they learn it? How did they find it?

Give your candidates some homework before the interview. Have them read the Beginner's Guide to SEO, maybe Google's Search Engine Optimization Starter Guide, or the demo modules at Distilled U. How much of it did they retain? More importantly, what did they learn? Which brings us to:

Do they have a basic understanding of what SEO is and why we do it?

I've seen a lot of people get excited about learning SEO, do OK for a year or two, and then crash and burn. The number one cause of SEO flame-out or burn-out, in my experience, is an inability to pivot from old tactics to new ones. This failure often stems from a fundamental lack of understanding of what SEO is (marketing, connecting websites that have stuff with people who want that stuff) and what it is not (any single SEO tactic).

It can be frustrating when the methods you originally learned on, or that used to work so well, dry up and blow away (I'm looking at you, siloing and PageRank sculpting). If you're focused on what tricks and tactics can get you ranking #1, instead of on how you're using digital techniques to market to and connect with potential customers, sooner or later the rug's going to get pulled out from under you.

Ask your candidates: what did they retain from their research? Are they totally focused on the search engine, or have they thought about how visits can turn into revenue? Do they seem more interested in being a hacker, or a marketer? Some people really fall in love with the idea that they could manipulate search engines to do what they want; I look for people who are more in love with the idea of using the Internet as a tool to connect businesses with their customers, since ultimately your SEO client is going to want revenue, not just rankings.

Another trait I look for in the interview process is empathy. Can they articulate why a business might want to invest in search? Ask them to imagine some fears or concerns a small business owner might have when starting up an Internet marketing program. This is especially important for agency work, where communicating success requires an understanding of your client's goals and concerns.

Can they write?

Even if you're looking to grow someone into a technical SEO, not a content creator, SEO involves writing well. They'll need to be able to create on-page elements that not only communicate topical relevance to search engines but also appeal to users.

This should go without saying, but in my experience definitely doesn't: their resume should be free of typos and grammatical errors. Not only is this an indicator of their ability to write while unsupervised, it's also an indicator of their attention to detail and how seriously they're taking the position.

Any kind of writing experience is a major plus for me when looking at a resume, but isn't necessarily a requirement. It's helpful to get some idea of what they're capable of, though. Ask for a writing sample, and better yet, look for a writing sample in the wild online. Have they blogged before? You'll almost certainly be exchanging emails with a candidate before an interview—pay attention to how they communicate via email. Is it hard to tell what they're talking about? Good writing isn't just about grammar; it's about communicating ideas.

I like to give candidates a scenario like "A client saw traffic to their website decline because of an error we failed to detect. We found and corrected the error, but their traffic numbers are still down for the month," and have them compose a pretend email to the client about what happened. This is a great way to test both their written communication skills and their empathy for the client. Are you going to have to proofread their client emails before they go out? That sounds tedious.

How are their critical thinking and data analysis skills?

A brand-new digital marketer probably won't have any experience with analytics tools like Google Analytics, and that's OK—you can teach them how to use those. What's harder to teach is an ability to think critically and to use data to make decisions.

Have your candidates ever been in a situation where they needed to use data to figure out what to do next? What about a time when they needed data to tell a story, back up a claim, or change someone's mind? Recent college grads should all have recent experience with this, regardless of their major—critical thinking and data analysis are what college is all about. How comfortable are they in Microsoft Excel? They don't have to love it, but if they absolutely loathe it, SEO probably isn't for them. Would it make them miserable to spend most of a day in a spreadsheet (not every day, but fairly regularly)?

Are they a citizen of the web?

Even if they've never heard of SEO, a new employee is going to have an easier time learning it if they're already pretty net savvy. An active web presence also indicates a general interest in the Internet, which is one indicator of whether they'll have long-term interest in digital marketing as a field. Do some recon: are they active on social media? Have they ever blogged? What comes up when you Google them?

Prior experience

Different applicants will have different backgrounds, and you'll have the best idea of what skills someone will need to bring to the table to fill the role you need. When I'm reading a resume, I take experience in any of these areas as a good sign:

  • Marketing
  • Advertising
  • Public relations
  • APIs (using them, creating apps with them, what have you)
  • Web development or coding of any kind
  • Web design
  • Copywriting

Your mileage may vary

Very few candidates are going to excel in all of the areas outlined above, and everyone you talk to is going to be stronger in some areas than others. Since digital marketing can include a wide variety of different tasks, keep in mind the things you'd actually like the person to do on the job; for example, written communication becomes somewhat less important in a non-client-facing role. At the very least, look for a smart, driven person who is excited about digital marketing as a career opportunity (not just as a next paycheck).

Hiring inexperienced people has its risks: the person you hire may not actually turn out to be any good at SEO. They may have more trouble learning it than you anticipated, and once they start doing it, they may decide that SEO just isn't what they want to do long-term.

On the other hand, hiring and training someone who's a great fit for your company culture and who is excited about learning often results in a better employee than hiring someone with experience who doesn't really mesh well with your team. Plus, teaching someone SEO is a great way to make sure they don't have any bad habits that could put your clients at risk. Best of all, you have the opportunity to unlock a whole career for someone and watch them grow into a world-class marketer—and that's a great feeling.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

By |February 18th, 2015|MOZ|0 Comments