Online consumer behaviour, and expectations of the online experience, change continuously with the introduction of new technology and new trends. To allow your business to adapt along with your customers, UX & CRO needs to be an integral part of your ongoing online strategy.
Its value doesn’t stop at tweaks to the design of your web pages, however: UX & CRO can fuel your SEO and content strategy, reduce wasted paid search budget, and even spark innovation in how you do business.
Encouragingly, the emphasis placed on UX & CRO amongst online businesses is growing, but some common misconceptions persist across the industry that can undervalue a UX & CRO strategy. Let’s take a look at a few:
“Nobody is clicking on the CTA, let’s change it to green”
A common assumption is that changing a CTA to green will generate a higher click-through rate and inevitably increase conversions, but this isn’t necessarily true.
Changing the colour of a CTA will change how users perceive and interact with it compared with the original, but it’s important to consider brand guidelines as part of this decision and ensure that the colour contrast meets accessibility guidelines (a quick contrast check is sketched after the questions below).
It isn’t always the colour of the CTA that needs to change to increase engagement; before jumping straight into changing the button colour, we need to ask the following:
Is the CTA visible and does it stand out enough?
Does the user have enough information to push them to click the CTA?
Does the copy set the user’s expectations around where they will land after clicking?
Do the shape and style of the CTA make it look clickable?
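To make the accessibility point concrete, here is a minimal sketch (plain Python, no third-party libraries) of checking a text/background colour pair against the WCAG 2.1 contrast-ratio formula. The hex values for the button are purely hypothetical, and your design system’s own checker should be the authoritative test.

```python
def relative_luminance(hex_colour: str) -> float:
    """WCAG 2.1 relative luminance of an sRGB colour given as '#rrggbb'."""
    hex_colour = hex_colour.lstrip("#")
    channels = [int(hex_colour[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground: str, background: str) -> float:
    """Contrast ratio between two colours: (lighter + 0.05) / (darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)

# Hypothetical CTA: white label on a green button.
ratio = contrast_ratio("#ffffff", "#2e7d32")
print(f"Contrast ratio: {ratio:.2f}:1")
print("Passes WCAG AA for normal text (>= 4.5:1):", ratio >= 4.5)
```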
“Overall conversion rate hasn’t increased, CRO doesn’t work for us”
After investing in CRO for several months, we want to see some return on that investment, but overall site conversion rate will not always show it.
Overall conversion rate is affected by many factors, which makes it an unreliable measure of CRO performance on its own.
An increase in PPC spend could push more purchase-ready users to the site, whereas an influx of organic traffic from a blog post could dilute the conversion rate.
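As a quick illustration of that dilution effect, here is a hedged sketch with made-up traffic and order figures: the core audience converts better after the CRO work, yet the blended site-wide rate falls because of the new low-intent blog traffic.

```python
def conversion_rate(orders: int, sessions: int) -> float:
    return orders / sessions

# Month 1 (hypothetical figures): mostly purchase-intent traffic.
before = conversion_rate(orders=300, sessions=10_000)          # 3.0%

# Month 2: the existing audience converts slightly better,
# but a popular blog post adds 10,000 low-intent organic sessions.
core = (330, 10_000)        # 3.3% on the original audience
blog = (50, 10_000)         # 0.5% on the new blog readers
after = conversion_rate(core[0] + blog[0], core[1] + blog[1])   # 1.9%

print(f"Overall CVR before: {before:.1%}, after: {after:.1%}")
# The blended conversion rate drops even though the core audience improved,
# which is why we segment by source and by page rather than judging
# CRO on the site-wide number alone.
```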
To really understand how CRO testing and implementation have made improvements, we need to be more granular with the data.
If we have run a series of tests on the basket page, then we should look specifically at basket page performance: exit rate, progression into the checkout, and basket page conversion rate.
What often happens is that basket page improvements push more users into the checkout, but users then abandon the checkout because of a poor experience there. End conversion rate stays the same, so it is not the metric we should use to measure success here.
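Here is a small sketch of that scenario with hypothetical funnel counts: basket-to-checkout progression improves after the test, but a leaky checkout keeps end conversion flat, so the basket-level metrics are where the win actually shows up.

```python
# Hypothetical funnel counts before and after a basket-page test.
funnels = {
    "before": {"basket": 10_000, "checkout": 4_000, "orders": 2_000},
    "after":  {"basket": 10_000, "checkout": 5_000, "orders": 2_000},
}

for label, f in funnels.items():
    basket_to_checkout = f["checkout"] / f["basket"]   # metric the test moved
    checkout_completion = f["orders"] / f["checkout"]  # where users now drop out
    end_conversion = f["orders"] / f["basket"]         # flat, and misleading alone
    print(f"{label}: basket-to-checkout {basket_to_checkout:.0%}, "
          f"checkout completion {checkout_completion:.0%}, "
          f"end conversion {end_conversion:.0%}")
```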
What we can do is learn from these results and investigate ways of optimising the checkout process.
“The campaign is not strategic enough”
The answer to this one is simple: set some goals.
We can’t focus solely on end conversion rate; we need smaller goals along the way to help us get to where we want to be, for example increasing average order value (AOV) with upsells, increasing product-to-basket progression, or reducing homepage bounce rate.
We also need to consider wider business objectives, such as minimising the volume of returned products, or increasing the number of calls that lead to sales.
It’s important that these KPIs and objectives are communicated throughout all testing documentation, so it’s clear how CRO is making an impact on wider business goals.
“We don’t have time to AB test, let’s just implement these changes”
The reason we AB test almost everything is that we don’t know whether it will work. The worst thing we could do is recommend implementing something purely because we think it will perform better, or because we think it will look better.
Everyone has opinions, but the most important opinion is that of the users, backed by what the data tells us, and we won’t know this until we put something in front of them and test it.
Time does need to be invested in AB testing. It’s important to run tests for a sufficient amount of time to ensure enough traffic enters the test for accurate and statistically significant results.
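To put a rough number on “a sufficient amount of time”, here is a sketch of the standard two-proportion sample-size estimate; the baseline conversion rate, target uplift and daily traffic below are assumptions, and your testing tool’s own duration calculator should have the final say.

```python
import math

def sample_size_per_variant(p1: float, p2: float,
                            z_alpha: float = 1.96,    # 95% confidence, two-sided
                            z_beta: float = 0.84) -> int:  # 80% power
    """Rough per-variant sample size to detect a change from rate p1 to p2."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Assumed baseline conversion of 3%, hoping to detect a lift to 3.6%.
n = sample_size_per_variant(0.030, 0.036)
daily_sessions_per_variant = 1_500   # hypothetical traffic after the 50/50 split
print(f"~{n:,} sessions per variant, "
      f"roughly {math.ceil(n / daily_sessions_per_variant)} days of traffic")
```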
It’s also important to spend time on results analysis, particularly when a test has a negative result and we need to find out why, and what we could do instead.
“I don’t have time to implement any CRO recommendations”
Despite showing increased user engagement, more conversions and often more revenue, a lot of successful AB test changes are pushed to the bottom of the web development queue as low priority, which means the business never benefits from the incremental value the test demonstrated.
We often get asked to use the testing tool to push the winning variation to 100% of traffic. Although this is technically possible, it’s not a long-term solution.
Rather than using the testing tool to push changes to the site, we should use positive test results as a business case to persuade stakeholders and decision-makers to prioritise these changes.
“We want changes and results, but you can’t touch this page, or that page.”
The pages that we AB test, and the changes that we make, are driven by the findings from qualitative and quantitative data research. If a test is proposed on the homepage, it’s because the homepage has proved problematic during the insight phase and there is opportunity to be gained by testing here.
We’re frequently told that certain pages are off limits, mainly for internal reasons; however, this hinders improvements and slows down results.
Testing on pages that are precious from an internal politics point of view can be extremely beneficial, as it helps mitigate risks, provides factual insights to settle internal debates, and challenges us to create a culture of experimentation and innovation.
Taking your UX & CRO strategy forwards
A UX & CRO strategy can make a big difference to a business’ bottom line, and it’s no longer a channel that should be viewed in a silo, but in line with your overarching online strategy, alongside organic search and paid media.