What is A/B testing?
A/B testing is a methodology in which two versions of an app or a webpage are compared against each other to understand which performs better. The approach works by randomly showing two or more variations of the page or screen to real users and then applying statistical analysis to determine which of the variations performed best against predetermined conversion goals.
A/B testing, or split testing, takes guesswork out of the picture by enabling data-backed decisions that move businesses from “we think” to “we know”. Furthermore, the data points can be used to understand user behavior, increase engagement, add new features, and revamp the screen or page.
Why A/B testing?
The approach enables teams and companies to make changes to their web pages and apps based on real user behavior, while getting clarity on how individual elements impact the user experience.
The insights that A/B testing provides usually create an impact that goes beyond a satisfied user base. Let us look at some of those other critical benefits.
- Solve pain points
Every visitor that comes to your website or app is looking to achieve a goal, which can be to learn more about the product, make a purchase, or simply browse before investing. Whatever the objective may be, they might face roadblocks on the journey to achieving it.
Running A/B tests on multiple versions of the in-platform user journey can bring those issues to the surface and enable businesses to solve them proactively.
- Lower bounce rate
One of the key metrics for judging an app's or website's performance is the bounce rate. There can be a number of reasons why people bounce off your page without completing the intended journey, such as confusing navigation, technical jargon, or too many options. While there is no one-size-fits-all solution for a high bounce rate, you can test multiple variations through A/B tests and pinpoint the friction points in the user journey.
- Make low-level changes
Redesigning pages can be expensive and resource-intensive. Moreover, the lag that a work-in-progress website or mobile app causes often leads to lower conversion rates and negative reviews.
A/B tests help you approach a redesign section by section, analyzing users’ reactions and finalizing the best option before conversions are affected.
- Improved ROI from existing traffic
The cost of capturing new users’ attention is far greater than that of retaining current traffic. A/B testing enables you to get maximum returns from existing traffic by fixing the parts of the journey that keep visitors from achieving your core business goals.
What should you A/B test?
Every piece that reaches your target audience should be tested to ensure that it leads users toward your goal. While there are some specific areas that should be tested for their conversion capabilities, it pays to test every element of your application or website, since you can never really tell what will make users click.
As a starting point, however, test the following elements under your optimization plan.
Copy
- Headlines
- Body content
- Subject lines
- Content length
Layout and design
- Combination of branding, conversion, and information
- Placement of CTAs
- Colour, font, and icon choice
- Visual representation of sections
Navigation
Forms
- Length
- Field options
- Placement
CTAs
- Images
- Color choice
- Text style and size
- Content
Social links
Different types of A/B tests
Once you understand which elements to test, it helps to know the different test methods. Applying the right method can help your business get maximum returns from your invested effort.
Split URL testing
In this approach, an existing URL is tested against a new version of the same web page by splitting the website traffic between the main page and its variation. As opposed to classic A/B testing, where only the frontend of the page or screen is changed, in split URL testing the entire web page is redesigned and rebuilt for comparison.
Multivariate testing
In this test model, variations of more than one web page or app screen variable are combined to analyze which combination of those variables works best for the product. Since designing and deploying variations of individual elements one at a time is time-consuming, this approach usually results in faster, more cost-effective decision making.
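To see why this saves time, consider how quickly the number of combinations grows. The sketch below enumerates every version a full-factorial multivariate test would need to serve; the variable names and values are purely illustrative, not taken from any real experiment.

```python
from itertools import product

# Hypothetical page variables for a multivariate test;
# each list holds the variations of one element.
headlines = ["Start your free trial", "Try it free today"]
cta_colors = ["green", "orange", "blue"]
hero_images = ["team_photo", "product_shot"]

# Full-factorial design: every combination of every variable's variations.
combinations = list(product(headlines, cta_colors, hero_images))
print(len(combinations))  # 2 * 3 * 2 = 12 page versions to split traffic across
```

Testing the same twelve versions as separate A/B tests would mean designing and launching each variation individually, which is the slower path the paragraph above alludes to.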
Multipage testing
In this case, the elements which are common across multiple pages are tested through their different versions. You can also take it a step further by testing the addition or removal of recurring elements like testimonials, badges, CTAs etc.
How does A/B testing work?
Running A/B tests on your webpage or app screens can be challenging for first-time optimizers. Here’s the framework that is typically adopted when performing the different types of A/B tests.
- Collect data
The first step in performing A/B tests is understanding what needs to be optimized on the website or application. It helps to start the analysis with high-traffic pages, screens with a high bounce rate, and core conversion pages when planning conversion rate optimization efforts.
Some of the tools that can come in handy at this stage are Google Analytics, Mixpanel, heatmap tools, and session recording platforms.
- Ideate test hypothesis
Once you have found the problem pages, you can start generating A/B test ideas that you think can improve the current versions. The ideas should then be prioritized by the impact they would create and their implementation complexity.
- Create multiple variations
After you have prioritized the hypotheses, the next step is creating variations based on them. The variations can include multiple things – a new design, layout changes, the removal or addition of recurring elements, etc.
- Run experiment
With the different variations ready, plan out the duration for which the tests will run, the expected outcome, the number of variations, and the percentage of visitors who will be included in the test. Next, run the test and record the results.
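One common way to implement the random split described above is deterministic hash-based bucketing, so a returning visitor always sees the same variation. A minimal sketch, assuming user IDs are available; the function name, experiment name, and variation labels here are hypothetical.

```python
import hashlib

def assign_variation(user_id, experiment,
                     variations=("control", "variant_b"), exposure=1.0):
    """Deterministically bucket a visitor: the same user_id and experiment
    name always map to the same variation. `exposure` is the fraction of
    visitors included in the test; the rest see the default experience."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
    if bucket > exposure:
        return None  # visitor is not part of the experiment
    index = int(bucket / exposure * len(variations))
    return variations[min(index, len(variations) - 1)]

# The assignment is stable across page loads and sessions.
assign_variation("user-42", "checkout-cta")
```

Hashing the combination of experiment name and user ID (rather than the user ID alone) keeps bucket assignments independent across experiments.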
- Analyze results
Once the test period is over, it is time to analyze the results based on the generated data. The analysis can help you ascertain which variation worked for your website or app screen. What is important at this stage is to have a clear representation of the results for sharing with the execution or decision-making teams.
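The analysis step usually boils down to a statistical comparison of conversion rates between the control and the variation. A minimal sketch using a two-proportion z-test; the visitor and conversion counts below are made up for illustration.

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of control (a) and variation (b).
    Returns the z statistic and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # normal approximation
    return z, p_value

# Hypothetical numbers: 4% vs 5% conversion on 5,000 visitors each.
z, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=250, n_b=5000)
# The variation "wins" at the conventional 5% level only if p < 0.05.
```

A dedicated statistics library (e.g. statsmodels) offers the same test with more options; the formula above is just the textbook version spelled out.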
What are the A/B testing goals?
What you want to measure from your A/B test will depend heavily on your business goals and industry-level metrics. Here are some examples of industry-specific split testing goals you can measure your digital product against.
eCommerce
- Page views
- Products added to the cart
- Completed purchase
Lead generation
- Clickthrough
- Social shares
- Form submission
Media
- Article read
- Subscriptions
- Comments
- Share
Customer support
- Issues resolved
- Support material downloads
- Customer satisfaction surveys filled
Blogs
- Page views
- Comments
- Link clicks
- Shares
- Form fill ups
While split testing goals vary according to the domain you operate in, there are some common objectives that should be considered in your A/B testing strategy.
- Page visits
- Links clicked
- CTAs clicked
- Form submission
- Revenue goal
- Scroll percentage
- Traffic source
What are the A/B testing mistakes to avoid?
A/B tests are a reliable way to ensure that your business goals are pursued in the right direction. Given the time and resources that go into running split testing procedures, it is critical that you avoid mistakes that would compromise the process.
Here’s a list of some of the common mistakes businesses make when performing A/B tests. Knowing what they are will help you avoid them from the start.
Not planning the optimization roadmap
The biggest mistake businesses make is basing the optimization hypothesis on what their competitors have done instead of looking into their own analytics. When you start with the wrong hypothesis, the probability of your test results failing automatically becomes higher.
Testing multiple elements at once
Building multiple variations of different website or app elements and testing them all together can lead to a situation where you don’t know what is accepted by the users and what isn’t. Running tests on specific elements helps you judge the performance of each variation far better.
Having incorrect duration
Based on your goals, A/B tests should run for a specific period. When you run a test for too short or too long a time, the chances of failure increase. It is therefore important to fix a timeline and stick to it, even when you see a positive change in numbers in the first few days after the variation launches.
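The "right" duration follows from sample size, not from the calendar. A rough sketch of the standard sample-size formula for comparing two conversion rates, at roughly 95% confidence and 80% power; the baseline rate and lift below are hypothetical.

```python
from math import ceil

def required_sample_size(base_rate, min_detectable_lift,
                         z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variation to detect an absolute lift of
    `min_detectable_lift` over `base_rate` (two-sided 5% significance,
    80% power, normal approximation)."""
    p1 = base_rate
    p2 = base_rate + min_detectable_lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / min_detectable_lift ** 2)

# Detecting a 4% -> 5% improvement takes thousands of visitors per variation.
n = required_sample_size(base_rate=0.04, min_detectable_lift=0.01)
```

Dividing `n` times the number of variations by your daily eligible traffic gives a principled test duration, which is what protects you from stopping early on a lucky streak.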
Not considering external factors
Split testing should be performed over a period that covers both high- and low-traffic days, for example, days on which traffic is at its highest as well as low-traffic days such as vacations and holidays. This way, the conclusions of the test will be much more reliable.
Real-world examples of A/B testing (split testing)
The A/B testing approach and its outcomes vary from company to company; what is common across all domains, however, is the improved customer experience. Here are three brands that saw success after running split tests.
Nissan
After facing a decline in in-person interactions, Nissan wanted a deeper understanding of its target audience. The approach it followed was running A/B tests and multivariate tests to see the impact of design elements and content. By the end of the testing, the brand saw a decline in bounce rate and improved conversion rates.
BBVA bank
The bank wanted to optimize its web and mobile experience, for which it created more than 1,000 split tests of the user experience design. By the time it finished acting on the test findings, it had increased its customer base by over 20%. Moreover, BBVA today has a 50/50 split between online and offline customers.
Swisscom
The telecom company wanted to meet its customers’ needs better and become more competitive. To that end, it tested content and banners to build real-time experiences and measure customer behavior. On the back of A/B testing, it found that its customers appreciate personalization – an insight that led the brand to a 40% rise in new users.
Conclusion
Performing A/B testing can help you not just improve your conversions but also remain relevant in the fast-paced digital environment. The different facets of split testing that have been covered here will help you plan out and deploy a well-strategized test model that can be used across different business goals and domains.
What is critical to understand is that A/B testing is not a one-time process: businesses should keep using past test results to build the next test strategy. Only when followed iteratively can the model help you keep your website's or app's performance on track.