Enterprise Conversion Optimization from WhichTestWon

Yes, Grammar Matters:

80% Fewer Conversions Because of Spelling Mistakes!

Can bad grammar and poor spelling really hurt your sales funnel? The jury is still out, as opinions seem mixed.

On the one hand, everyone knows language is flexible and so, too, are the rules that govern it. Even Oxford Dictionaries had the good humor and sense of reality to pick an emoji (the 'Face with Tears of Joy' pictograph) as its 2015 Word of the Year.

On the other hand, though, the Internet is teeming with self-appointed grammar Nazis. Any CRO expert will tell you that poor grammar and spelling hurt your brand's credibility and your customers' trust. And… you just know your 8th grade English teacher would take a red pen to some of your Twitter posts.

So, who’s right? After all, both opinions make sense. It’s true that good grammar is essential for professional-grade copywriting. But it’s just as true that the definition of good grammar changes along with language and the times we live in.

The problem is that neither side of the debate seems to have the empirical evidence to support their opinion.

Where’s the psychological research to argue in favor of language rule-bending?

Where are the A/B test results to support those claims that poor spelling directly correlates with fewer conversions?

We dug around for the studies and the conversion rate improvement data—you can find it all right below. But first, let’s frame the debate.

What is bad grammar, anyway?

The use of ‘bad’ grammar isn’t always misguided or inappropriate. On the contrary, it can serve as a very effective sales strategy.

Take, for instance, GOP presidential hopeful Donald Trump. Political issues aside, Trump's language has been analyzed to gauge its comprehension level. The result: the Donald speaks at a fourth-grade level. And, as this video shows, he often phrases his sentences awkwardly in order to emphasize certain key words.

Trump may bend the rules of language, but he is a billionaire whose popularity as a politician has been soaring. Why? Probably because he manages to adapt his speech to the context of his audience. In marketing terms, his brand speaks the language of his targeted market segment.

The same applies to social media-savvy brand Denny's, whose tweets often forgo capitalization and punctuation, emulating the writing styles of their predominantly young customers.

[Image: Denny's Twitter profile]

The takeaway here is that grammar and spelling rules should be observed as is appropriate in a given context. If your brand is fun, hip, loud, and/or appeals to the young on social media, some rule-bending is almost recommended. However, also bear the following in mind:

  •        Involuntary misspellings won't make you seem hip, just careless about details.
  •        Customers will pay attention to details when they're on your ecommerce site, trying to decide whether they can trust you enough to make a purchase.
  •        The main goal of conversion optimization is more sales.

The above makes sense on an intuitive level. And while the correlations between spelling and business growth, profit, and conversion haven’t been researched in depth, plenty of lip service has been paid to the importance of trust in sales.

Is trust all it’s cracked up to be?

Yes, trust matters. Here’s what we mean by trust, how it affects sales, and why all CROs should pay attention to it.

How do we come to trust brands?

According to BJ Fogg, your website is a business card. It’s often the source of the user’s first impression of your brand and one of its major sources of revenue. Here are the kinds of trust/credibility a brand is grounded in:

–        Presumed credibility: a better-known brand is presumed more trustworthy than one you've barely heard of;

–        Reputed credibility: word-of-mouth marketing (what your friends think of a brand, its products, and services);

–        Surface credibility: what examining a brand’s website will tell you at first glance;

–        Earned credibility: your own experience with the brand—including any and all possible grammar and spelling mistakes.

Usability.gov also writes about building credibility here and explains that users are attuned to quickly spotting factors that may erode their trust. Interestingly enough, their findings have revealed that younger users (age 28 and below) tend to be harsher judges of typos, broken links, and other similar errors.

But is bad grammar really that bad for sales? The psychological research evidence can be conflicting.

On the one hand, BJ Fogg proposes a behavior model in which customers’ actions (such as purchases) are influenced by motivation, triggers, and ability. If it’s easy for them to make the purchase and they want the product badly enough, they’re not going to be put off by some bad spelling here and there.

[Image: BJ Fogg behavior model]
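To make the model concrete, here's a toy sketch in Python. The 0-to-1 scores, the multiplicative combination, and the 0.5 threshold are our illustrative assumptions, not Fogg's published math; the point is simply that a trigger plus enough motivation and ability can outweigh a minor annoyance like a typo.

```python
# Toy sketch of BJ Fogg's behavior model: a behavior (e.g., a purchase)
# fires when a trigger arrives while motivation and ability together
# clear an action threshold. Scales and threshold are illustrative only.

def purchase_happens(motivation: float, ability: float,
                     trigger_present: bool, threshold: float = 0.5) -> bool:
    """motivation and ability are scored 0.0-1.0."""
    if not trigger_present:
        return False  # no trigger, no behavior, however motivated the user
    return motivation * ability >= threshold

# A shopper who badly wants the product (0.9) on an easy checkout (0.8)
# converts despite a stray typo; a lukewarm shopper (0.4) bails.
print(purchase_happens(0.9, 0.8, trigger_present=True))  # True  (0.72 >= 0.5)
print(purchase_happens(0.4, 0.8, trigger_present=True))  # False (0.32 <  0.5)
```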

On the other hand, trust is hard to earn and important to keep. The Broken Windows theory, borrowed from crime rate analysis, holds that people will tolerate small nuisances (broken windows, unkempt yards, spelling mistakes), but they'll also wonder whether those nuisances are red flags for bigger problems. Applied to ecommerce: your clients might not flee in horror when they spot 2 or 3 spelling mistakes, but they might wonder if your order fulfillment process is just as sloppy.

Furthermore, this paper explains that trust, together with perceived risk and perceived benefits, has more impact on the loyalty of online customers than satisfaction does. To illustrate, think of Old Navy and Topshop. The former ran a jersey line featuring the misspelled slogan "Lets go [Team Name]", while the latter managed to spell Shakespeare's name wrong on what was supposed to be a tribute t-shirt.

 

[Image: Topshop "Romeo" tribute t-shirt]

Both brands recalled the poorly spelled apparel, which likely cost them quite a few thousand dollars. And, to add insult to injury, they also lost valuable word-of-mouth marketing. The damage keeps snowballing to this day, since what goes on the Web stays on the Web, as Adrian Snood writes for Social Media Today.

The correlation between spelling, grammar, and trust

Though split tests on the effect of bad spelling on conversions don’t exactly abound, there is a growing body of correlative evidence. Here are some relevant poll and survey results from recent years:

  •       Colour Works surveyed 1,700 online dating platform users and found that 43% consider bad grammar unappealing, while 35% found good grammar highly appealing.
  •        Grammarly reviewed 100 LinkedIn profiles of native English speakers working in the packaged goods industry, all of whom had worked for no more than 3 employers during the first 10 years of their careers. Those who had not reached managerial positions within those 10 years made 2.5 times as many grammar mistakes as those who had. Those with only 1 to 4 promotions in 10 years made 45% more mistakes than those with 6 to 9 promotions. And those who stayed in the same job for 10 years made up to 20% more grammar mistakes than those who had held 6 jobs in that period.
  •       OKCupid's analysis of response rates to first messages revealed that messages with proper grammar and punctuation have 37% better odds of getting positive responses.
  •        A study undertaken at Clemson University found that the credibility of an academic writer is directly correlated with the accuracy of his/her language use.

[Image: Clemson University study on language accuracy and writer credibility]

Can’t spell? Your audience will frown.

Predictably enough, the empirical research correlating online brand credibility with spelling and grammar accuracy confirms the above. The short of it: users really dislike bad grammar.

  •       The Search Laboratory polled users and found that 80% value good spelling, layout, and grammar usage in online copywriting.
  •        UK research company Global Lingo surveyed 1,029 users and found that 59% of them wouldn't use the products or services of a business whose site featured blatant grammar and spelling mistakes. 82% wouldn't use a company with poorly translated materials, and 74% say they notice and mind bad grammar.
  •        London-based agency Disruptive Communications found that, of the 1,003 users they surveyed, 42.5% named bad grammar and spelling as the thing most likely to damage their opinion of a brand's posts. Significantly, users in the 18-to-24 demographic were far less bothered by this.

[Chart: Disruptive Communications survey on brands' grammar in social media]

  •       Grammarly also compared the LinkedIn pages of 6 major brands: Coca-Cola vs. Pepsi, Google vs. Facebook, and Ford vs. General Motors. In 2 of these 3 match-ups, the brand with fewer grammar and spelling mistakes got more engagement. Coca-Cola, with a quarter of Pepsi's mistakes, got twice as many likes as the competition on its last 10 posts. And Google had twice as many followers as Facebook, as well as 3 times more likes on its last 10 posts.
  •        The BBC famously reported the story of one Charles Duncombe, an online entrepreneur whose experience with a single spelling mistake led him to believe such errors cost the UK millions. Among other online stores, Duncombe runs one that sells tights… which was spelling the word "tihgts". Though search engines can identify misspelled words and suggest correct versions, fixing the error manually allegedly improved the site's conversions by 80%. Duncombe says a single mistake can cut online sales in half.

[Image: TightsPlease.co.uk screenshot]

 

In addition to conversions, spelling and grammar are very important for SEO. Google wants to provide a good experience for users, so it makes sense that language quality plays a role in rankings. To verify this, look no further than Google's own Webmaster Central blog, where an article encourages webmasters to assess the quality of their pages and posts. Some questions the algorithms can allegedly answer include:

–        Are there style, factual, and/or spelling mistakes in the content?

–        Has the content been checked for quality?

–        Does the content appear to have been edited, or does it look sloppy?

–        How much attention to detail has been invested in producing the article?

And, if that's not enough, check out what Matt Cutts has to say about it: "reputable sites tend to spell better, and the sites that have lower PageRank, or very low PageRank, tend not to spell as well".

Bing's Senior Product Manager at the time, Duane Forrester, confirmed this approach in a 2014 interview, explaining that search engines are judged by the quality of the content they deliver. Why, then, would any major search engine risk displaying poorly spelled, ungrammatical content?

Not so long ago, SEO 'experts' were optimizing for common misspellings on purpose; nowadays, Google runs two features that correct for such searcher mistakes: 'Did you mean?' and 'Showing results for [presumed correct spelling]'.
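For illustration, here's a minimal "did you mean?" sketch using Python's standard difflib module to match a query against a known vocabulary. Google's actual spelling systems are far more sophisticated (statistical models trained on query logs), and the catalog vocabulary below is a made-up example.

```python
# Minimal "did you mean?" sketch: suggest the closest known term when a
# query doesn't match the vocabulary. Real search engines use statistical
# language models, not simple string similarity; this only shows the idea.
import difflib

CATALOG_TERMS = ["tights", "leggings", "stockings", "socks"]  # hypothetical vocabulary

def did_you_mean(query: str, vocabulary=CATALOG_TERMS):
    """Return the closest known term, or None if the query already matches."""
    if query in vocabulary:
        return None
    matches = difflib.get_close_matches(query, vocabulary, n=1, cutoff=0.6)
    return matches[0] if matches else None

print(did_you_mean("tihgts"))  # -> 'tights'
```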

There are also several signals that point to the usefulness of removing user-sourced spelling and grammar mistakes.

  •        One blogger reports better page load times and rankings after removing all user comments, presumably because doing so also did away with spam and poorly written comments.
  •        When hotel reviews on TripAdvisor and Travelocity are written using proper grammar, demand for that hotel increases. The same goes for products with reviews on Amazon. It doesn’t matter if the reviews are positive or negative: correct language will boost the bottom line regardless.
  •       Zappos has been using Amazon's Mechanical Turk since 2009 to fix the spelling and grammar in its reviews. The monumental effort of combing through some 5 million reviews suggests the practice is paying off financially.

How to deal with ‘bad grammar’ and misspellings

So far, we've established that bad copy, peppered with spelling mistakes and grammar errors, can hurt sales. However, it doesn't always do so. To help drive home the importance of context and message matching, let's take a look at some dos and don'ts for writing in context.

  1.      Don’t write for your English teacher.

Write for your customers. Some of them will actually find it easier to connect with an informal, casual style. They will like contractions, abbreviations, and slang, and unless they are academics, they'll likely tolerate dangling prepositions and one-sentence paragraphs.

[Image: "How to Write Good" by Frank L. Visco]

 

  2.      Do check for money leaks.

All too often, an ecommerce website leaks money through tiny cracks in its veneer. To find and seal those cracks, you can:

o   Implement a top-down page performance review system. Review your best- and worst-performing pages each month. Make sure the copy is error-free, the images are sharp, and there's no time reference that can date the content (e.g. 'as seen on 2008 runways' on a fast-fashion eStore).

  3.      Do test, test, test.

Forget HiPPOs (the Highest Paid Person's Opinion). Test even the smallest of changes instead of throwing a bowl of pasta at the wall to see what sticks. A/B test results are your best bet at improving conversions; a minimal significance check is sketched below.
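As a starting point for that habit, here is a minimal sketch of the standard two-proportion z-test used to judge whether an A/B result is statistically significant. The visitor and conversion counts are hypothetical; dedicated testing tools run this kind of check (plus sample-size planning and more) for you.

```python
# Minimal two-proportion z-test: is variant B's conversion rate
# significantly different from control A's? Counts below are hypothetical.
from math import sqrt
from statistics import NormalDist

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (rate_a, rate_b, z, two_sided_p) for a two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical run: control converts 500/10,000, variant 580/10,000.
p_a, p_b, z, p = ab_significance(500, 10_000, 580, 10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z={z:.2f}  p={p:.4f}")  # p ≈ 0.0124
```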



February 6, 2016 | Posted by sumeenaa

Biggest, Best & Most of 2015

At WhichTestWon, our goal is to provide you with interesting, objective data about testing and conversion rate optimization.

We just did something really cool for you. We looked at all the Tests of the Week published over the year (Jan. 1, 2015 to Jan. 1, 2016) and analyzed them based on five factors:

  • Geographic location
  • Software used
  • Software used by geographic location
  • Test design
  • Conversion focus

Our findings are very interesting and we want to share them with you!

In this short report, we tell you where testers were focused, what software they used, and how they designed their tests*.

*It's important to note that WhichTestWon publishes only the best tests submitted to us. We receive many entries, but selectively choose only those tests that meet our rigorous testing standards.

The data compiled below is a summary of the best of the best tests of 2015.

Here’s what we found:

  1. Tests by Geographic Location

Looking at our 2015 Test of the Week entries by geographic location, you’ll see the lion’s share (39%) of tests came from companies or agencies headquartered in the US.

[Chart: Tests by geographic location]

The UK (26%) and Europe (20%) were the next biggest sources of submitted tests. Australia and New Zealand comprised a small slice (7%) of published Test of the Week case studies. Unfortunately, only a small percentage (4%) of published tests came from Asia. In the future, we'd love to see more Asian studies! If you're in Asia and running tests, please hit us up and submit your tests for publication. Other countries, like India, Israel, and Canada, made up the remaining small percentage (4%) of submissions. In 2016, we hope to see an even more diversified collection of tests from, literally, all over the world! You can submit your studies anytime, anywhere through this easy-to-fill-out entry form.

  2. Software Used

To run a reliable, valid test, the software you use can make or break your study.

The majority of our published Test of the Week entries used one of three testing platforms.

[Chart: Testing software used]

At 20% each, Visual Website Optimizer and Optimizely tied for the top spot. Adobe Target came in at 13%.

Trailing further behind, 9% of testers used Google Content Experiments.

Interestingly, however, a significant chunk (15%) used their own proprietary software or testing technologies. Building proprietary testing tools seems to be a growing market. We anticipate this number will rise in 2016.

22% turned to ‘other’ tools and technologies to run their tests. What comprised ‘other’? Read on.

Other Software

Platforms like Marketo, HubSpot and Bounce Exchange accounted for a big chunk of the 'other' software providers used.

[Chart: Other testing software used]

We anticipate these digital marketing-focused platforms will continue gaining traction with testers in the coming years.

  3. Software by Geographic Location

Geographic location definitely influenced the type of software used. Certain countries exhibited a strong preference for particular testing tools.

[Chart: Software used by geographic location]

In Europe, the strong majority (56%) of testers used Visual Website Optimizer.
However, in the UK, Optimizely dominated the market, with 55% of testers using this competing tool.

In the US, Adobe Target (22%) was the most-used software, while VWO and Optimizely were equally popular, tying at 11% each.

In Asia and the Pacific region, there was no clear trend in testing software used, likely because too few tests were submitted to draw obvious conclusions.

However, interestingly, in India, our limited sample showed proprietary testing technology was the way to go. Of our Indian studies published, 100% used proprietary tools to run their tests.

  4. Test Design

In terms of test design, this year the largest share of published studies (13%) focused either on button color/copy or on radical redesigns.

This finding is interesting because both button studies and radical redesigns tend to be viewed as some of the least optimal tests to run.

However, it's important to note: we don't choose the tests you run; we just publish them.

Despite our rigorous submission criteria, we can only publish the types of tests you submit to us. If many testers run – and submit – button tests and radical redesigns, these are the studies that will be featured as our Tests of the Week.

At WhichTestWon, we definitely feel there’s a need for testers to push the bounds and run more sophisticated tests. Our State of Online Testing Report addresses this finding in more depth.

[Chart: A/B test design]

At 11%, tests looking at design elements like page layout, font, and text size were the next most popular studies run by testers.

Additionally, studies featuring content offers, navigation features, or overlays to convert visitors were also heavily tested (9% each).

Very few tests looked at personalization or audience segmentation. But, we do hope to see this trend reverse in the coming year as testers and testing technologies become more sophisticated.

  5. Conversion Focus

In terms of the focus of our published tests, the largest share (24%) looked at how to increase click-through rates. Although click-through rates are important, in the future we hope to see more testers going further down the conversion funnel.

Final conversion funnel metrics like total sales, average order value, and revenue per visitor are highly important for a company’s bottom line. A big lift at the bottom of the funnel proves the value of testing and bodes well for your job.
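For reference, here's how those bottom-of-funnel metrics relate to one another, computed from hypothetical numbers. Note that revenue per visitor is simply the conversion rate times the average order value, which is why a lift in either one shows up in RPV.

```python
# Bottom-of-funnel metrics from hypothetical traffic and sales figures.
visitors, orders, revenue = 10_000, 250, 18_750.00

conversion_rate = orders / visitors       # 2.50% of visitors buy
average_order_value = revenue / orders    # AOV: $75.00 per order
revenue_per_visitor = revenue / visitors  # RPV: $1.875 per visitor (= CR * AOV)

print(f"CR={conversion_rate:.2%}  AOV=${average_order_value:.2f}  "
      f"RPV=${revenue_per_visitor:.3f}")
```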

This year, 9% of published tests focused their conversion metrics on sales. And, 5% looked at Revenue Per Visitor (RPV) or reservations/bookings.

[Chart: Conversion focus]

Leads (17%), form fills (11%), and orders (11%) were the other most commonly tracked conversion metrics.

As the testing industry matures, we hope to see more sophisticated tests looking further down the funnel.

In 2016, we look forward to seeing your submissions for upcoming Tests of the Week.

To submit your study for potential publication, simply click on this link and follow the easy steps to fill out our submission form.

Thanks for being part of WhichTestWon. Best wishes for a very happy New Year!



January 4, 2016 | Posted by Andrea W

New MVT Test for Your Vote: Harry & David’s Site-Wide Header Test

Have you ever debated what should be included in your site's header? This week's header test by Harry & David's ecommerce store increased purchases by a double-digit amount. See if you can guess which header did the trick…
http://whichtestwon.com/archives/16442

There were seven total elements tested: search box elements such as outline, highlight, and color; search term copy and location; and the look of the header’s cart icon.
See the creative samples and guess which one won. Let us know your thoughts in the comments.
http://whichtestwon.com/archives/16442



May 23, 2012 | Posted by admin
Category: Other, Uncategorized

New A/B Test for Your Vote: Long Copy vs. Shorter Copy Landing Page Test

Landing page content is always a balancing act of not enough information vs. too much copy clutter. See whether longer or shorter copy increased sales by 62% on this landing page.

http://whichtestwon.com/archives/16365

This week's case study is a clean A/B test – the versions are identical except one omits a FAQ section that had appeared below the fold. Note: Click twice on each image to blow it up to full size.

Can you guess WhichTestWon? Let us know your thoughts in the comments.
http://whichtestwon.com/archives/16365



May 16, 2012 | Posted by admin
Category: Landing Page Tests, Landing Pages, Uncategorized

New A/B Test for Your Vote: Surprising Ecommerce Page Results – Nav Bar Removal

Site navigation is something every website should be testing. This week’s Ecommerce Product Page test has some interesting results that should remind you to always be testing.
http://whichtestwon.com/archives/16321

This is a clean A/B test juxtaposing two product pages that are 100% identical except one version includes site navigation in the left-hand margin and the other does not.

Can you guess WhichTestWon? Let us know your thoughts in the comments.

http://whichtestwon.com/archives/16321



May 9, 2012 | Posted by admin
Category: eCommerce Tests, Uncategorized

New A/B Test for Your Vote: TRUSTe Security Seal vs. No Seal – Which Form Got More Submits?

Have you ever wondered how important trust icons are on your lead generation forms? We’ve wondered the same thing! Can you guess which form got the most submissions?

http://whichtestwon.com/archives/16273

This is a straightforward A/B test pitting a short form with no trust indicator against one with the TRUSTe insignia. This was the ONLY element that changed on the page.

What do you think? Does the trust indicator add 'peace of mind' or cause confusion for the site's visitors?

http://whichtestwon.com/archives/16273



May 2, 2012 | Posted by admin
Category: Page Element Tests (Buttons, Images, Overlay, etc.), Uncategorized

New A/B Test for Your Vote: Award-winning Lead Generation Test – 39% Lift

Does a clean entrance page or additional site navigation increase leads for Dell’s cloud computing solution? Can you guess which test won in this 2012 award-winning test?
http://whichtestwon.com/archives/14582

This section of Dell's site has more than 60 pages of content to digest. Dell hypothesized that the additional nav would increase leads, and ran this test to find out whether it would lift conversions or add too much clutter.

What do you think? Did this additional nav increase leads or confusion? Vote and tell us what you think:

http://whichtestwon.com/archives/14582



April 25, 2012 | Posted by admin
Category: Award Winners, Lead Generation Tests, Uncategorized

New A/B Test for Your Vote: LinkedIn Subscription Offer Test – Does Dynamic Content or UI Familiarity Lift Conversions?

This week’s test showcases a subscription offer page test by social media giant LinkedIn. Can you guess which of the radically redesigned pages increased subscription purchases by 20%?
http://whichtestwon.com/archives/16194

There are many differences between the two pages. The most notable is that one page uses dynamic content to display specific call-outs based on user info and site location, while the other uses static content and maintains the familiar LinkedIn UI.

Can you guess which page increased subscription purchases?
Vote and tell us what you think:
http://whichtestwon.com/archives/16194



April 18, 2012 | Posted by admin
Category: social, Uncategorized

New A/B Test for Your Vote: Homepage Test – Do Geo-Targeted Call-Outs Increase Conversion … or Clutter?

This week's test is a must-view for all sites getting traffic from more than one country. If you market globally (or want to someday), see if you can guess which homepage won the test:
http://whichtestwon.com/archives/16150

The pages were identical except for the addition of a dynamic text call-out targeted at international visitors. The country’s name was displayed based on the visitor’s geolocation.

Is it better to have a one-size-fits-all homepage or should you include additional targeted call-outs on your site?
Vote and tell us what you think:
http://whichtestwon.com/archives/16150



April 11, 2012 | Posted by admin
Category: Uncategorized

New A/B Test for Your Vote: Text vs HTML Email – 303.8% Lift

Have you run an HTML vs. plain text email test lately? With the winning version here increasing revenue by 303.8%, you may want to soon… Can you guess which version it is?
http://whichtestwon.com/archives/15976

Both email versions shared an identical subject line, landing page, and offer, and both were sent to segments of the same list.

Here’s the link to vote – and share your comments:
http://whichtestwon.com/archives/15976



April 4, 2012 | Posted by admin
Category: Email Tests, Uncategorized
