Case Study: Solving The Discrepancy Between A/B Testing Tool, Google Analytics and Backend Data For a Large E-Commerce Business

Data discrepancy

With the help of Reflective Data’s custom data pipeline, BigQuery and improved analytics integrations, we managed to bring the discrepancy between different tools reporting on the same data down from ~35% to ~2.5%.

About the client

Our client in this case study is a large global e-commerce website headquartered in the US.

Their main focus is men’s clothing, specialising in dress shirts as well as ties, suits, casualwear, shoes and accessories.

Depending on the season, their monthly online revenue is between 10 and 15 million USD, coming from around 45,000 monthly transactions.

Due to NDAs, we are not allowed to disclose the name of the client. Everything else in this case study remains unchanged.

The challenge

Working with skewed data can be worse than having no data at all. This is why we always promote analytics audits and making sure all data sources agree with each other. At the very least, you should know why the numbers in different tools don’t match (e.g. analytics doesn’t include offline sales but the backend does).

Our client in this case study contacted us with a rather specific problem. They were running a solid CRO program with 4–6 A/B experiments every month. The problem they had with the program, though, was that the numbers they saw in their testing tool (Optimizely), Google Analytics and their backend didn’t match. In fact, there was a ~35% discrepancy overall.

While it is almost impossible to reach 100% accuracy with JavaScript-based tools due to ad blockers, cookie deletion etc., and some tools use different logic for calculating metrics like conversions, sessions and users, we knew that this discrepancy was far beyond acceptable limits and that immediate action was needed.

This kind of discrepancy was even more concerning as the client was actively running their CRO program. It wasn’t rare for them to see test results flip depending on where they got their numbers (Google Analytics vs Optimizely). This put their entire CRO effort into question internally, and management wasn’t far from shutting it down. The results simply weren’t trustworthy enough.

The solution

Even though the client described some rather clear symptoms, it is always necessary to start with a proper diagnosis of the problem. This way we can be sure that there indeed is a discrepancy between the tools – in some cases, the problem lies in how you read the numbers. Another goal of the diagnosis was to understand the size and reach of the problem.


The very first thing we did to diagnose and measure the discrepancy was to get access to all data sources: the testing tool (Optimizely), Google Analytics and the client’s backend numbers. Next, we made sure the metrics were comparable (e.g. all the transactions we see in the backend are indeed supposed to show up in Google Analytics as well). We then put those numbers side by side in a spreadsheet and grouped them by date.

After analyzing the daily numbers, we concluded that there was indeed an average discrepancy of 35% between Optimizely and Google Analytics (fewer transactions in the testing tool). The discrepancy between Google Analytics and the backend, on the other hand, was rather small – just below 3%. This can be considered acceptable for a JavaScript-based analytics tool.
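To give a sense of the comparison, here is a minimal sketch of the per-day discrepancy calculation we tabulated in the spreadsheet. The numbers and column names are illustrative, not the client’s actual data.

```python
def discrepancy_pct(reference: float, measured: float) -> float:
    """Percentage of reference transactions missing from the measured tool."""
    if reference == 0:
        return 0.0
    return (reference - measured) / reference * 100

# Daily transaction counts (illustrative numbers only)
daily = [
    {"date": "2019-05-01", "backend": 1500, "ga": 1460, "optimizely": 980},
    {"date": "2019-05-02", "backend": 1420, "ga": 1385, "optimizely": 915},
]

for day in daily:
    ga_vs_opt = discrepancy_pct(day["ga"], day["optimizely"])
    backend_vs_ga = discrepancy_pct(day["backend"], day["ga"])
    print(day["date"], round(ga_vs_opt, 1), round(backend_vs_ga, 1))
```

Averaging the first column over the full period is what gave us the ~35% figure; the second column stayed around the acceptable ~3% mark.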

The discrepancy between Google Analytics and Optimizely meant we had to dig deeper on that side. Unfortunately, Optimizely doesn’t give access to the raw transaction-ID-level data by default. Luckily, though, we were able to get this data from their support team. With Google Analytics, exporting transaction-level data was doable with a custom report and a CSV export. For the Google Analytics dataset, we included dimensions that would give us extra context: payment gateway, products, device category, country etc.

After pulling transaction-ID-level data from Optimizely and Google Analytics into spreadsheets, we wrote a quick function that marked the transactions in Google Analytics that didn’t show up in Optimizely. Then it was time to start looking for patterns. Did all the missing transactions have something in common?
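The flagging logic itself is a simple set difference. A sketch, with hypothetical transaction IDs:

```python
def flag_missing(ga_ids, optimizely_ids):
    """Return GA transaction IDs that never showed up in Optimizely."""
    return sorted(set(ga_ids) - set(optimizely_ids))

# Hypothetical transaction IDs from both exports
ga = ["T1001", "T1002", "T1003", "T1004"]
opt = ["T1001", "T1003"]

flag_missing(ga, opt)  # ['T1002', 'T1004']
```

Joining the flagged IDs back to the extra context dimensions (payment gateway, device category, country) is what let us hunt for patterns.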

This is how one version of the spreadsheet looked; it highlighted transactions in the backend that didn’t show up in Google Analytics.

Google Analytics vs Backend

Implementing the fixes

Since the website and the payment processing system were rather complex, it turned out there was more than one such pattern. They were all related to the use of various payment gateways, device categories and local sub-sites for specific countries.

In order to come up with a proper solution, we had to dig even deeper into how transaction tracking was set up for both Google Analytics and Optimizely. Luckily, we were able to detect the shortcomings quite fast, and fixing them wasn’t rocket science either. After a few rounds of testing and a couple of A/A tests in Optimizely, we were able to bring the discrepancy down from 35% to 2.5%.

The main reason for the mismatch was that while Google Tag Manager was used to unify transaction tracking for Google Analytics, tracking snippets for Optimizely were scattered all over the place. The solution was to move all Optimizely tracking into Google Tag Manager and use the same triggers as Google Analytics.

Going forward

Since the site has a rather complicated structure with multiple payment gateways and different pages for different countries/markets, we wanted to make sure that after implementing the fixes, the discrepancy would not start sneaking in again.

The solution here was a proper monitoring system that would get data from Google Analytics, Optimizely and the backend. Since the client was already on Google Cloud, we recommended using BigQuery as their data warehouse technology. Besides being really fast and cost-effective, BigQuery has some of the best integrations which makes implementing and using it as a marketing data warehouse much easier.

Getting data into BigQuery

For Google Analytics data, we used a technology called Parallel Tracking. This allowed us to send raw, unsampled hit-level Google Analytics data into BigQuery in real time, and at a much lower cost than Google Analytics 360.

To get data from Optimizely into BigQuery, we built a custom connector using the Optimizely API. This allowed us to pull only the information that was necessary and to make sure it’s always up to date.
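The core of such a connector is the transform step between the API response and the BigQuery table. A minimal sketch of that step, assuming a flat event shape; the field names here are hypothetical and the real Optimizely API response will differ:

```python
# Fields we want to keep out of each event before loading into BigQuery.
# These names are hypothetical, chosen for illustration.
KEEP = ("transaction_id", "experiment_id", "variation_id", "timestamp", "revenue")

def to_bigquery_rows(api_events):
    """Reduce raw API event dicts to only the columns the warehouse table needs."""
    return [{key: event.get(key) for key in KEEP} for event in api_events]
```

Pulling only the necessary fields keeps the table schema small and the streaming inserts cheap.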

Since the client had a custom backend infrastructure, we built a custom connector between that and BigQuery as well.

All connectors are now running on Google Cloud.

Reporting and alerts

In order for the client to be able to check the numbers from all three sources in one place, we built a dashboard using Google Data Studio which has a native connector with BigQuery.

To detect any discrepancies as soon as possible, we also created an automated alerting system with some custom code on Google Cloud Platform. It sends an email to the relevant parties every time a certain threshold is passed. So far, it has already detected a few issues the client was able to fix before more serious damage was done.
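The decision logic behind such an alert can be as simple as the sketch below. The 5% threshold is a hypothetical default, not the client’s actual setting:

```python
def should_alert(backend_count: int, tool_count: int, threshold_pct: float = 5.0) -> bool:
    """True when the discrepancy between the backend and a tool exceeds the threshold.

    threshold_pct defaults to 5.0 here purely for illustration.
    """
    if backend_count == 0:
        # Any tool-side transactions with an empty backend is itself suspicious.
        return tool_count != 0
    discrepancy = abs(backend_count - tool_count) / backend_count * 100
    return discrepancy > threshold_pct

should_alert(1000, 900)  # True  (10% discrepancy)
should_alert(1000, 975)  # False (2.5% discrepancy)
```

On Google Cloud, a check like this can run on a schedule against the BigQuery tables and trigger the email when it returns True.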


Overall, we were quite happy with the results of this project but I believe it’s better to end with a quote from the client.

We hadn’t worked with Reflective Data before but they were recommended by our friends in the industry. We needed a quick but reliable solution to our problem. Our management was questioning the results of our CRO program and we weren’t far from getting shut down.

The team at Reflective Data was able to immediately understand our problem and jumped right into solving it. I think altogether it took just around two weeks for them to diagnose and, together with our IT team, fix the problems behind this rather large discrepancy.

After great success with our first project together, we’ve continued working with them on various data-related projects around our data warehouse, reporting and dashboards.

Director of e-commerce

We are so glad about these words from our client – it’s moments like this that give meaning to our work. By the way, their CRO program has only grown since we fixed the mismatch. So, in a way, we helped save the jobs of an entire team.

Should you have similar problems with data discrepancies or other data-related requirements, please contact us – I’m quite sure we can work together to find a solution.
