Yes, Grammar Matters:
80% Fewer Conversions Because of Spelling Mistakes!
Can bad grammar and poor spelling really hurt your sales funnel? The jury is still out, as opinions seem mixed.
On the one hand, everyone knows language is flexible and so, too, are the rules that govern it. Even Oxford Dictionaries had the good humor and sense of reality to pick an emoji as the 2015 Word of the Year.
On the other hand, though, the Internet is teeming with self-appointed grammar Nazis. Any CRO expert will tell you that poor grammar and spelling hurt your brand’s credibility and customers’ trust. And… you just know your 8th grade English teacher would take a red pen to some of your Twitter posts.
So, who’s right? After all, both opinions make sense. It’s true that good grammar is essential for professional-grade copywriting. But it’s just as true that the definition of good grammar changes along with language and the times we live in.
The problem is that neither side of the debate seems to have the empirical evidence to support their opinion.
Where’s the psychological research to argue in favor of language rule-bending?
Where are the A/B test results to support those claims that poor spelling directly correlates with fewer conversions?
We dug around for the studies and the conversion rate improvement data—you can find it all right below. But first, let’s frame the debate.
The use of ‘bad’ grammar isn’t always misguided or inappropriate. On the contrary, it can serve as a very effective sales strategy.
Take, for instance, GOP presidential hopeful Donald Trump. Political issues aside, Trump’s language has been analyzed to gauge its comprehension level. The results: the Donald speaks at a fourth grade level. And, as this video shows, he often phrases his sentences awkwardly in order to emphasize certain key words.
Trump may bend the rules of language, but he is a billionaire whose popularity as a politician has been soaring. Why? Probably because he manages to adapt his speech to the context of his audience. In marketing terms, his brand speaks the language of his targeted market segment.
The same applies to social media-savvy brand Denny’s, whose tweets often forgo capitalization and punctuation. Thus, they manage to emulate the writing styles of their predominantly young customers.
The takeaway here is that grammar and spelling rules should be observed as is appropriate in a given context. If your brand is fun, hip, loud, and/or appeals to the young on social media, some rule-bending is almost recommended. However, also bear the following in mind:
The above makes sense on an intuitive level. And while the correlations between spelling and business growth, profit, and conversion haven’t been researched in depth, plenty of lip service has been paid to the importance of trust in sales.
Yes, trust matters. Here’s what we mean by trust, how it affects sales, and why all CROs should pay attention to it.
According to BJ Fogg, your website is a business card. It’s often where users form their first impression of your brand, and it’s one of your major sources of revenue. Here are the kinds of trust/credibility a brand is grounded in:
– Presumed credibility: a better known brand is presumed more trustworthy than one you’ve barely ever heard about before;
– Reputed credibility: word-of-mouth marketing (what your friends think of a brand, its products, and services);
– Surface credibility: what examining a brand’s website will tell you at first glance;
– Earned credibility: your own experience with the brand—including any and all possible grammar and spelling mistakes.
Usability.gov also writes about building credibility here and explains that users are attuned to quickly spotting factors that may erode their trust. Interestingly enough, their findings have revealed that younger users (age 28 and below) tend to be harsher judges of typos, broken links, and other similar errors.
But is bad grammar really that bad for sales? The psychological research evidence can be conflicting.
On the one hand, BJ Fogg proposes a behavior model in which customers’ actions (such as purchases) are influenced by motivation, triggers, and ability. If it’s easy for them to make the purchase and they want the product badly enough, they’re not going to be put off by some bad spelling here and there.
On the other hand, trust is hard to earn and important to keep. One criminology theory, the Broken Windows theory, explains that people will tolerate small nuisances (broken windows, unkempt yards, spelling mistakes), but they’ll also wonder whether those nuisances point to a bigger red flag. Let’s apply this to ecommerce: your clients might not flee in horror when they spot 2 or 3 spelling mistakes. But they might wonder if your order fulfillment process is just as sloppy.
Furthermore, this paper explains that trust, together with perceived risk and perceived benefits, has more impact on the loyalty of online customers than does satisfaction. To illustrate, think of Old Navy and Topshop. The former brand ran a jersey line that featured the misspelled slogan “Lets go [Team Name]”, while the latter managed to spell Shakespeare’s name wrong on what was supposed to be a tribute t-shirt.
Both brands recalled the poorly spelled apparel, which likely cost them quite a few thousand dollars. And, to add insult to injury, they also lost valuable word-of-mouth marketing. This continues to snowball to this day, since what goes on the Web stays on the Web, as Adrian Snood writes for Social Media Today.
Though split tests on the effect of bad spelling on conversions don’t exactly abound, there is a growing body of correlative evidence. Here are some relevant poll and survey results from recent years:
Predictably enough, the empirical research that examines online brand presence credibility correlated with spelling and grammar accuracy has confirmed the above. The short of it: users really dislike bad grammar.
In addition to conversions, spelling and grammar are, indeed, very important for SEO. Google wants to provide a good experience for users, so it makes sense that language quality factors into rankings. To verify this, look no further than Google’s own Webmaster Central blog. In this article, they encourage webmasters to assess the quality of their pages and posts. Some questions the algorithms can allegedly answer include:
– Are there style, factual, and/or spelling mistakes in the content?
– Has the content been checked for quality?
– Does the content appear to have been edited, or does it look sloppy?
– How much attention to detail has been invested in producing the article?
And, if that’s not enough, check out what Matt Cutts has to say about it: pages with high PageRank, i.e. “reputable sites tend to spell better, and the sites that have lower PageRank, or very low PageRank, tend not to spell as well”.
Duane Forrester, then Senior Product Manager at Bing, confirmed this approach in a 2014 interview, explaining that search engines are judged based on the quality of the content they deliver. So, then, why would any major search engine run the risk of displaying poorly spelled, ungrammatical content?
If, not that long ago, SEO ‘experts’ were deliberately optimizing for misspellings, nowadays Google runs two mechanisms that correct for searchers’ mistakes: the ‘did you mean?’ suggestion and the automatic ‘showing results for [presumed correct spelling]’ rewrite.
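To illustrate the basic mechanics behind a ‘did you mean?’ suggestion, here is a minimal Python sketch using Levenshtein edit distance. The vocabulary and query are made-up toy examples; Google’s actual system is vastly more sophisticated, drawing on query logs and context.

```python
def edit_distance(a, b):
    """Levenshtein distance: minimum single-character edits to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def did_you_mean(query, vocabulary):
    """Suggest the known word closest to the (possibly misspelled) query."""
    return min(vocabulary, key=lambda w: edit_distance(query, w))

# Hypothetical toy vocabulary:
suggestion = did_you_mean("recieve", ["receive", "conversion", "grammar"])
print(suggestion)  # receive
```

A real corrector would also weight candidates by how often users search for them, so that a common word beats a rare one at equal edit distance.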
There are also several signals that point to the usefulness of removing user-sourced spelling and grammar mistakes.
So far, we’ve established that bad copy, peppered with spelling mistakes and grammar errors can hurt sales. However, it doesn’t necessarily always do so. To help drive home the importance of context and message matching, let’s take a look at some dos and don’ts for writing in context.
Write for your customers. Some of them will actually find it easier to connect with an informal, casual style. They will like contractions, abbreviations, and slang, and unless they are academics, they’ll likely tolerate dangling prepositions and one-sentence paragraphs.
All too often, an ecommerce website will be leaking money through tiny cracks in its veneer. To help seal those cracks, you can:
o Implement a top-down page performance review system. Check out your best and worst performing pages each month. Make sure the copy is error-free, the images are sharp, and there’s no time reference that can date the content (e.g. ‘as seen on 2008 runways’ on a fast fashion eStore).
o Test, test, test. Forget HiPPOs. Test even the smallest of changes, instead of using the ‘throw a bowl of pasta at the wall to see what sticks’ approach. A/B test results are your best bet at improving conversions.
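As a concrete companion to “test, test, test”, here is a minimal sketch of how you might check whether an A/B result is statistically significant, using a standard two-proportion z-test. The conversion counts are invented for illustration; a real analysis should also fix the sample size in advance and avoid peeking.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test, using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    return (p_b - p_a) / se

# Hypothetical numbers: control converts 200/5000, variation 260/5000.
z = ab_test_z(200, 5000, 260, 5000)
print(round(z, 2))  # 2.86 — |z| > 1.96 means significant at the 95% level
```

With these made-up numbers the lift clears the 95% significance bar, so the variation would be declared the winner; a |z| below 1.96 would mean the difference could easily be noise.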
At WhichTestWon, our goal is to provide you with interesting, objective data about testing and conversion rate optimization.
We just did something really cool for you. We looked at all the Tests of the Week published over the year (between Jan. 1, 2015 and Jan. 1, 2016), and we analyzed them based on five factors:
Our findings are very interesting and we want to share them with you!
In this short report, we tell you where testers were focused, what software they used, and how they designed their tests*.
*It’s important to note that WhichTestWon publishes only the best tests submitted to us. We receive many entries, but selectively choose only those tests that meet our rigorous and stringent testing standards.
The data compiled below is a summary of the best of the best tests of 2015.
Here’s what we found:
Looking at our 2015 Test of the Week entries by geographic location, you’ll see the lion’s share (39%) of tests came from companies or agencies headquartered in the US.
The UK (26%) and continental Europe (20%) were the next biggest sources of submitted tests. Australia and New Zealand comprised a small slice (7%) of published Test of the Week case studies.

Unfortunately, only a small percentage (4%) of published tests came from Asia. In the future, we’d love to see more Asian studies! If you’re in Asia and running tests, please hit us up and submit your tests for publication.

Other countries, like India, Israel, and Canada, made up the remaining small percentage (4%) of submissions.

In 2016, we hope to see an even more diversified collection of tests from, literally, all over the world! You can submit your studies anytime, anywhere through this easy-to-fill-out entry form.
To run a reliable, valid test, the software you use can make or break your study.
The majority of our published Test of the Week entries used one of three testing platforms.
Trailing further behind, 9% of testers used Google Content Experiments.
Interestingly, however, a significant chunk (15%) applied their own proprietary software or testing technologies. Creating proprietary testing tools seems to be a growing market. We anticipate this number will rise in 2016.
22% turned to ‘other’ tools and technologies to run their tests. What comprised ‘other’? Read on.
Platforms like Marketo, HubSpot and Bounce Exchange accounted for a big chunk of the ‘other’ software providers used. We anticipate these digital marketing-focused platforms will continue gaining traction with testers in the coming years.
Geographic location definitely influenced the type of software used. Certain countries exhibited a strong preference for particular testing tools.
In Europe, the strong majority (56%) of testers used Visual Website Optimizer.
However, in the UK, Optimizely dominated the market, with 55% of testers using this competing tool.
In the US, Adobe Target (22%) was the most used software, while VWO and Optimizely were equally popular, tying at 11% each.
In Asia and the Pacific region, there was no clear trend on testing software used. Likely because there were too few tests submitted to draw obvious conclusions.
However, interestingly, in India, our limited sample showed proprietary testing technology was the way to go. Of our Indian studies published, 100% used proprietary tools to run their tests.
In terms of test design, this year the largest share of published studies (13% each) focused either on button color/copy or on radical redesigns.
This finding is interesting because both button studies and radical redesigns tend to be viewed as some of the least optimal tests to run.
However, it’s important to note – we don’t choose the tests you run – we just publish them.
Despite our rigorous submission criteria, we can only publish the types of tests you submit to us. If many testers run – and submit – button tests and radical redesigns, these are the studies that will be featured as our Tests of the Week.
At WhichTestWon, we definitely feel there’s a need for testers to push the bounds and run more sophisticated tests. Our State of Online Testing Report addresses this finding in more depth.
Additionally, studies featuring content offers, or those looking at navigation features, and using overlays to convert visitors were also heavily tested (9% each).
Very few tests looked at personalization or audience segmentation. But, we do hope to see this trend reverse in the coming year as testers and testing technologies become more sophisticated.
In terms of the focus of our published tests, the largest share (24%) looked at how to increase click-through rates. Although click-through rates are important, in the future, we hope to see more testers going further down the conversion funnel.
Final conversion funnel metrics like total sales, average order value, and revenue per visitor are highly important for a company’s bottom line. A big lift at the bottom of the funnel proves the value of testing and bodes well for your job.
This year, 9% of published tests focused their conversion metrics on sales. And, 5% looked at Revenue Per Visitor (RPV) or reservations/bookings.
As the testing industry matures, we hope to see more sophisticated tests looking further down the funnel.
In 2016, we look forward to seeing your submissions for upcoming Tests of the Week.
To submit your study for potential publication, simply click on this link and follow the easy steps to fill out our submission form.
Thanks for being part of WhichTestWon. Best wishes for a very happy New Year!
Have you ever debated what should be included in your site’s header? This week’s header test by Harry & David’s ecommerce store increased purchases by a double-digit amount. See if you can guess which header did the trick…
There were seven total elements tested: search box elements such as outline, highlight, and color; search term copy and location; and the look of the header’s cart icon.
See the creative samples and guess which one won. Let us know your thoughts in the comments.
Landing page content is always a balancing act of not enough information vs. too much copy clutter. See whether longer or shorter copy increased sales by 62% on this landing page.
This week’s case study is a clean A/B test – the versions are identical except one omits a FAQ section that had appeared below the fold. Note: Click twice on each image to blow it up to full size.
Site navigation is something every website should be testing. This week’s Ecommerce Product Page test has some interesting results that should remind you to always be testing.
This is a clean A/B test juxtaposing two product pages that are 100% identical except one version includes site navigation in the left hand margin and the other version does not.
Can you guess WhichTestWon? Let us know your thoughts in the comments.
Have you ever wondered how important trust icons are on your lead generation forms? We’ve wondered the same thing! Can you guess which form got the most submissions?
This is a straightforward A/B test pitting a short form with no trust indicator against one with the TRUSTe insignia. This was the ONLY element that changed on the page.
Does a clean entrance page or additional site navigation increase leads for Dell’s cloud computing solution? Can you guess which test won in this 2012 award-winning test?
This section of Dell’s site has over 60 pages of content to digest. Dell hypothesized that the additional nav would increase leads, and ran this test in order to find out if it would lift conversions or add too much clutter.
What do you think? Did this additional nav increase leads or confusion? Vote and tell us what you think:
This week’s test showcases a subscription offer page test by social media giant LinkedIn. Can you guess which of the radically redesigned pages increased subscription purchases by 20%?
There are many differences between the two pages. The most notable is one page uses dynamic content to display specific call-outs based on user info & site location, and the other uses static content while maintaining the familiar LinkedIn UI.
This week’s test is a must-view for all sites getting traffic from more than one country. If you market globally (or want to someday) see if you can guess which homepage won the test:
The pages were identical except for the addition of a dynamic text call-out targeted at international visitors. The country’s name was displayed based on the visitor’s geolocation.
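The geolocation-driven call-out described above can be sketched in a few lines. This is a toy illustration, not the tested site’s actual implementation: the country codes, messages, and function names are all hypothetical, and a real site would get the country code from a geo-IP lookup service.

```python
# Hypothetical mapping from a visitor's ISO country code (as returned by a
# geo-IP lookup) to a localized homepage call-out. Illustrative values only.
CALLOUTS = {
    "DE": "We ship to Germany!",
    "FR": "We ship to France!",
}

def callout_for(country_code, default="We ship worldwide!"):
    """Return the country-specific call-out, falling back to a generic one."""
    return CALLOUTS.get(country_code, default)

print(callout_for("DE"))  # We ship to Germany!
print(callout_for("BR"))  # We ship worldwide!
```

The design point the test makes is the fallback: international visitors from unmapped countries still see a sensible generic message rather than a blank spot on the page.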
Have you run an HTML vs. plain text email test lately? With the winning version here increasing revenue by 303.8%, you may want to soon… Can you guess which version it is?
Both email versions shared an identical subject line, landing page, and offer, and both were sent to segments of the same list.