At Endless Gain, we analyse all our A/B tests using Google Analytics, where appropriate. This not only helps us verify that the test data is recorded correctly, but also allows us to get additional insights, as the reporting features of testing platforms have some limitations.
We work with various platforms, including Qubit, Monetate, Sitegainer, Google Optimize, VWO, AB Tasty, and Optimizely.
Most of those tools have some kind of integration with Google Analytics (GA) and we make sure that it is enabled for every test we run. Of course, we’ll need a whole different article to talk about integrating the platforms and getting them to work coherently!
Before I move on to tips on analysing A/B test data in Google Analytics, let me briefly explain why we primarily use Analytics instead of the reports generated by the testing platforms themselves.
Why Use Google Analytics Instead of Testing Platforms for Reporting
- Flexibility to add more business goals: Typically, A/B testing platforms are limited in how many additional goals you can add and track. In GA, we can add more goals and compare their results side by side.
- Easy to verify and integrate with existing reports: If you already have Google Analytics and use it to create business and goal reports, you can integrate the A/B testing results with those reports. If you’re creating reports from another tool, you’ll have to manually add those metrics to your common reporting system. Because GA applies the same tracking across all your reports, it also gives you an independent way to verify the test results.
- Easy to share: It’s difficult to share the reports from some of the platforms, but Analytics makes it quite easy.
- Easy segmentation of data: The reports generated by some testing platforms aren’t well segmented and depend mostly on filtering. Most also don’t allow comparisons. GA, on the other hand, allows you to flexibly segment data, which opens endless options—segmentation by device, traffic sources, interactions with other tests or website elements, funnel analysis, etc. This also gives you the opportunity to really dig into the behaviour of the experience. Comparison between segments is also very easy in Google Analytics.
This doesn’t mean that you shouldn’t use the A/B testing platform’s reporting at all. It’s definitely easier and more convenient to check the live test performance using the testing platform. However, using Google Analytics for the end-of-test reporting is WAY better as it gives you more options and segmented data.
4 Ways to Make A/B Testing Data Analysis and Reporting on GA Easier [with Screenshots]
So, what should you keep in mind when analysing your test data on Google Analytics? Here are some tips to get you started. Once the basics are clear to you, you should be able to find several advanced options that you can use to get all the data relevant to your business.
1. Create segments to understand your users better
By adding segments (or conditions) to your reporting, you’ll get the right kind of data you need. You can see results based on which pages get the most traffic, on traffic sources, demographics, visitor type, location, browser and device, etc. You can also add specific filters for inclusion and exclusion of data types.
Most A/B testing platforms send the test and variation information to GA as events. You can find these events in the Top Events report under Behaviour, and this is where you can create custom segments.
If you’re using VWO and your integration is through Universal Analytics, then you’ll need to set up segments using the correct custom dimension.
If you’re using Google Optimize, you’ll have to look for Experiment ID under the Details tab.
When you create segments, choose Experiment ID with Variant and add :0 for control and :1 for variation.
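As a small illustration of that naming convention, here is a sketch (in Python, with a made-up experiment ID) of how the “Experiment ID with Variant” values break down, with `:0` for the control and `:1`, `:2`, and so on for the variations:

```python
# The "Experiment ID with Variant" value is "<experiment ID>:<variant index>",
# where :0 is the control (original) and :1, :2, ... are the variations.
# The experiment ID used below is made up for illustration.

def label_variant(experiment_combination: str) -> str:
    """Turn an 'Experiment ID with Variant' value into a readable label."""
    experiment_id, _, variant = experiment_combination.rpartition(":")
    if variant == "0":
        return f"{experiment_id} (control)"
    return f"{experiment_id} (variation {variant})"

print(label_variant("aBcDeF123:0"))  # aBcDeF123 (control)
print(label_variant("aBcDeF123:1"))  # aBcDeF123 (variation 1)
```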
2. Create custom reports
We use a custom report that allows us to see the test data split by device. If you want to add any custom calculations, you can download the report as an Excel file and filter the data there.
If you would like to use it, here is a link to the template.
3. Monitor test performance using Google Sheets
You can also monitor your test performance using Google Sheets. Log in to Google Sheets using the same credentials as for Google Analytics. Create a new spreadsheet and in the top menu, go to Add-ons > Get add-ons. A new pop-up will open. Select Google Analytics from the list of apps. Install it.
Go back to Add-ons. Google Analytics will show up as one of the options now. Click Create new report.
A new pop-up will open. Name your report, select the Account, Property, and View, and add metrics. I use Sessions, Users, Transactions, and Revenue. You can also add a Dimension (I’ve selected Device Category). Click on Create Report.
A new tab – Report Configuration – will open. Here, make sure you have one column for the Control and one for the Variation. Remember, the report names (see row 2 in the image below) must be different. Select your test dates (in US date formatting—MM/DD/YYYY) and add your segment details.
Once you finish, go to Add-ons > Google Analytics > Run Reports. This will create new tabs—one for each column in the report configuration. Always check whether the report contains sampled data, as sampling can distort the results.
You can now add a new tab to pull all the test data into one sheet and add the calculations you need to analyse the test:
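The calculations themselves are simple. As an illustration (the figures below are made up), here is how conversion rate and relative uplift could be derived from the sessions and transactions pulled into the summary tab—the same maths works as plain cell formulas in Sheets:

```python
# Illustrative calculation of conversion rate and relative uplift from the
# sessions/transactions figures pulled into the summary tab.
# All numbers are made up for the example.

def conversion_rate(transactions: int, sessions: int) -> float:
    return transactions / sessions

control = {"sessions": 10_000, "transactions": 300}
variation = {"sessions": 10_050, "transactions": 342}

cr_control = conversion_rate(control["transactions"], control["sessions"])
cr_variation = conversion_rate(variation["transactions"], variation["sessions"])

# Relative uplift: how much better (or worse) the variation converts.
uplift = (cr_variation - cr_control) / cr_control

print(f"Control CR:   {cr_control:.2%}")
print(f"Variation CR: {cr_variation:.2%}")
print(f"Uplift:       {uplift:+.2%}")
```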
Once this is done, you can share the report with your team directly from the browser, using the share button.
You can also schedule the report to be updated daily. Set the end date to =today()-1 so the report always covers data up to yesterday, then go to Add-ons and select Schedule Reports.
4. Run detailed reports at the end of a test
Don’t get hyper-focused on one dimension – for example, whether the conversion rate uplifts differ from device to device. You can use secondary dimensions to see the differences for new/returning users, different traffic sources, or whatever additional data will help your test.
You can also download daily performance data.
And if you run an e-commerce site and notice a change in average order value (AOV), you can download the sales performance data for the control and variation, check the order value distribution, look for any outliers, and run the results through a significance calculator.
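A significance calculator for conversion rates is itself only a few lines. Here is a sketch of a standard two-proportion z-test (one common way such calculators work—other tools may use different tests), again with made-up figures:

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-tailed p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Made-up example: 300/10,000 conversions (control) vs 360/10,000 (variation).
p = two_proportion_z_test(300, 10_000, 360, 10_000)
print(f"p-value: {p:.4f}",
      "(significant at 95%)" if p < 0.05 else "(not significant)")
```

A p-value below 0.05 is the conventional threshold for calling a result significant at the 95% confidence level, though the threshold you use should match the one agreed for your testing programme.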