Overview of Website Optimizer Reports
Once an experiment begins running on your pages, Website Optimizer will show the data being collected in your reports. The statistics available in Website Optimizer reports vary depending on the type of experiment you're running. If you're running an A/B experiment, you'll see conversion data for each alternate page variation that you're trying out, along with the performance of the original page. If you're running a multivariate experiment, you'll see conversion data for each combination as well as each page section overall.
When viewing reports, remember that Website Optimizer tests all traffic to a page, rather than just traffic from AdWords advertisements. Also, you can set the percentage of your site traffic to include in an experiment, limiting how many users see test content.
The conversion data in your Website Optimizer reports may not match the traffic recorded by web analytics tools. While analytics tools record all traffic to your pages without exception, Website Optimizer reports only on visitors included in your experiment, which may be just a fraction of your site traffic. Another likely cause of discrepancies is differing session lengths. A session aggregates a visitor's activity over a length of time, so that stats reflect the activity of a single visitor. Website Optimizer uses a longer session length than analytics tools do, so it may record a different number of conversions for each visitor.
Unless your test page receives very little traffic, statistics should appear within a few hours of starting the experiment. If no data appears after a few hours, the most likely cause is that the tags were not correctly implemented on one or more of the pages you're testing.
About Website Optimizer Combinations and Variations Reports
A/B experiments display a variations report, and multivariate experiments display a combinations report; both present the same data. For A/B tests, information is collected for each alternate page variation; for multivariate tests, for each combination. These are the columns you'll see in either report:
Estimated conversion rate provides the most immediate insight into overall performance. This column shows how well each combination or variation is performing relative to your original content. It displays the numerical data, as well as a colored performance bar for each combination to visually indicate its estimated conversion rate range. As your experiment progresses, you'll notice that the performance bar changes colors. Here's what those colors mean:
- Green: We're confident that this combination is doing better than the original.
- Yellow: This combination could be doing a bit better or worse than the original, but there isn't enough data to be sure yet.
- Gray: This combination is performing at the same level as the original.
- Red: This combination isn't doing as well as the original.
The chance to beat original column displays the probability that a combination will be more successful than the original version. When numbers in this column are high, around 95%, that means a given combination is probably a good candidate to replace your original content. Low numbers in this column mean that the corresponding combination is a poor candidate for replacement.
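Website Optimizer doesn't publish the exact statistics behind this column, but a common way to estimate a "chance to beat original" is a Bayesian comparison of two conversion rates. The sketch below uses Beta posteriors and Monte Carlo sampling; the function name and the visitor/conversion counts are illustrative assumptions, not the product's actual method.

```python
import random

def chance_to_beat_original(orig_conv, orig_visits,
                            var_conv, var_visits, samples=100_000):
    """Estimate P(variation's true rate > original's true rate) by
    drawing from Beta(conversions + 1, non-conversions + 1) posteriors."""
    wins = 0
    for _ in range(samples):
        p_orig = random.betavariate(orig_conv + 1, orig_visits - orig_conv + 1)
        p_var = random.betavariate(var_conv + 1, var_visits - var_conv + 1)
        if p_var > p_orig:
            wins += 1
    return wins / samples

# Hypothetical counts: the original converted 40 of 1,000 visitors,
# the variation 55 of 1,000. The estimate lands in the mid-90% range,
# i.e. a strong candidate to replace the original.
print(chance_to_beat_original(40, 1000, 55, 1000))
```

Note how the estimate depends on sample sizes, not just rates: the same 4% vs. 5.5% split over only 100 visitors each would yield a much less confident number, which is why the column firms up as the experiment collects data.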
Observed improvement displays the percent improvement over the original page. Because this percentage is a ratio of a combination's conversion rate to the original's conversion rate, it will often vary widely early on. We suggest that you concentrate on the improvement figure only once a large amount of data has been collected and it can be considered more reliable.
Conversions/visitor is just that -- the raw data of how many conversions a particular combination or variation generated, divided by the number of visitors.
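To make the last two columns concrete, here is the arithmetic connecting conversions/visitor to observed improvement, using made-up counts:

```python
# Hypothetical raw counts for an original page and one variation.
orig_conversions, orig_visitors = 40, 1000
var_conversions, var_visitors = 55, 1000

# Conversions/visitor: raw conversions divided by visitors.
orig_rate = orig_conversions / orig_visitors   # 0.04  (4.0%)
var_rate = var_conversions / var_visitors      # 0.055 (5.5%)

# Observed improvement: the variation's rate relative to the original's.
improvement = (var_rate - orig_rate) / orig_rate
print(f"{improvement:.1%}")  # prints 37.5%
```

A 37.5% improvement sounds dramatic, but it rests on a difference of only 15 conversions; a handful of conversions either way would swing it substantially, which is why the report's improvement figure is noisy until the sample is large.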
When the experiment has run long enough to gather sufficient data, and Website Optimizer can clearly tell that one variation or combination is performing well, your report will show an alert that a high-confidence winner has been found. This alert tells you that one or more of the combinations or variations you are testing (depending on the type of experiment) has significantly outperformed your original content.