So, here’s how marketers can help and, in the process, become more connected to and driven by data:
2. Marketers should help define KPIs so they can connect their media and CRM campaigns to the entire funnel, including conversion success.
2. Marketers should be the leading source of hypotheses for testing and analysis. These hypotheses should come from their sense of brand, the market, design and customer relationships.
3. Marketers should help prioritize the hypotheses based on high level goals and themes they are pursuing for the brand and business.
4. Marketers should work with the analysts to help articulate and present the insights back to the business so that change is most likely to be adopted.
5. Marketers should become great customers for their analysts, defining requirements at the outset and providing context that might not be apparent, including events and promotions that could taint data.
In short, marketers should increasingly think of themselves as optimizers.
All of these are true! But that's a focus on how analysts should develop their own skills, and this post is more process-oriented.
Part 1: “I believe…[some idea]”
Part 2: “If I am right, we will…[take some action]”
This construct does a couple of things:
Analysts can provide a lot of value by setting up automated (or near-automated) performance measurement dashboards and reports. These are recurring; hypothesis testing is not. Once you test a hypothesis, you don't need to keep retesting it unless you make a change that warrants doing so.
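To make the recurring nature of this concrete, here is a minimal sketch of the kind of function that could sit behind an automated daily report. The function name, row fields (`channel`, `visits`, `conversions`), and sample numbers are all illustrative assumptions, not taken from any specific analytics tool:

```python
def daily_report(rows):
    """Summarize one day's performance from raw per-channel rows.

    Each row is a dict like {"channel": ..., "visits": ..., "conversions": ...};
    these field names are hypothetical, chosen for illustration.
    """
    total_visits = sum(r["visits"] for r in rows)
    total_conversions = sum(r["conversions"] for r in rows)
    by_channel = {}
    for r in rows:
        ch = by_channel.setdefault(r["channel"], {"visits": 0, "conversions": 0})
        ch["visits"] += r["visits"]
        ch["conversions"] += r["conversions"]
    return {
        "visits": total_visits,
        # Guard against a zero-traffic day.
        "conversion_rate": total_conversions / total_visits if total_visits else 0.0,
        "by_channel": by_channel,
    }

# Example input for a single day (made-up numbers).
rows = [
    {"channel": "email", "visits": 200, "conversions": 20},
    {"channel": "search", "visits": 800, "conversions": 40},
]
report = daily_report(rows)
```

Because the logic is deterministic and parameterized only by the day's rows, it can be scheduled to run unattended, which is exactly what distinguishes recurring reporting from one-off hypothesis tests.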
First, there is how the request should be structured: the information I try to capture as the request comes in:
The next step is to actually assess the request. This is something an analyst generally needs to do, and it covers two main areas:
If the analytics and optimization organization is framed across these three main types of services, then conscious investment decisions can be made:
Having a process in place to capture all three types of effort in a discrete, trackable way enables reporting back on the value delivered by the organization:
Even though every A/B test is unique, certain elements are usually tested:
A prerequisite for making any of this work is having a statistically significant number of visits to your site on a daily basis, right?
I think many people first have to figure out how to cross that bridge before they start to worry about optimizing what's on the other side.
I’d been trying not to overthink the problem (I often get called a stats wonk) and simply to trust the tools at my disposal and follow common practices:
It was only after getting some strange results and digging deeper into the statistics that I discovered how dangerous this was.
We are seeking a 95% confidence level. What I found was that even if you are:
As many as one in five of your “successful” results may in fact come from having accidentally (randomly) sent more high-converting (“email”) traffic to one variant or the other. (This explains why people sometimes find “successful” results when they are actually comparing two identical pages).
The glimpse of light at the end of the tunnel is that the longer you run a test, the more the channel distribution converges (by the law of large numbers) to be the same for each variant. This means we can fix the problem by running our tests for longer.
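A quick simulation illustrates this convergence. The sketch below randomly assigns visits to variant A or B, where a fixed 20% of traffic is high-converting "email" traffic, and measures how far variant A's email share drifts from 20%. The traffic split, email share, and trial counts are all illustrative assumptions:

```python
import random

random.seed(7)

def mean_channel_drift(n_visits, p_email=0.2, trials=300):
    """Average absolute drift of variant A's email-traffic share from p_email,
    across many simulated tests with random 50/50 variant assignment."""
    drifts = []
    for _ in range(trials):
        a_total = a_email = 0
        for _ in range(n_visits):
            is_email = random.random() < p_email
            if random.random() < 0.5:  # visit randomly lands in variant A
                a_total += 1
                a_email += is_email
        if a_total:  # skip the (vanishingly rare) empty-variant case
            drifts.append(abs(a_email / a_total - p_email))
    return sum(drifts) / len(drifts)

short_test = mean_channel_drift(100)    # a short test: large channel drift
long_test = mean_channel_drift(2500)    # a longer test: drift shrinks
```

The drift shrinks roughly with the square root of the sample size, which is why accidental channel imbalance (and the spurious "wins" it produces) fades as a test runs longer.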
If you want to embed a testing culture in your organization, you're looking for quick wins and a demonstrated ability to get results. That means adopting a 'lean' approach and getting quick validation of the concept, rather than building a large, complex structure before a single test has taken place.
Google’s New Content Experiments API Turns Google Analytics Into A Full-Blown A/B Testing Platform | TechCrunch
So, comparing A/B testing and multi-armed bandit algorithms head to head is wrong because they are clearly meant for different purposes. A/B testing is meant for strict experiments where focus is on statistical significance, whereas multi-armed bandit algorithms are meant for continuous optimization where focus is on maintaining higher average conversion rate.