Management And Accounting Web

Anderson, E. T. and D. Simester. 2011. A step-by-step guide to smart business experiments. Harvard Business Review (March): 98-105.

Summary by James R. Martin, Ph.D., CMA
Professor Emeritus, University of South Florida


Analytics focused on dissecting data from historical transactions is relatively complicated, while data generated by experiments is much easier to interpret. Therefore, most companies will derive more value from simple business experiments than from analyzing big data. The purpose of this paper is to provide a step-by-step guide to conducting smart business experiments. Managers need to embrace the test-and-learn approach: take an action with one (test or treatment) group of customers, take a different action (or no action) with another (control) group of customers, and compare the results.

Testing Customers' Responses Requires Thinking Like a Scientist

Testing customers' responses is fairly easy for some companies (e.g., catalog companies, direct mail houses, and online retailers) and more difficult, or nearly impossible, for others (e.g., those that rely on television advertising). Most companies fall between these extremes. A scientifically valid business experiment requires three elements: a treatment group, a control group, and a feedback mechanism.

Treatment and control groups should be selected randomly and kept separate to prevent the actions taken with the treatment group from spilling over to the control group. This may be difficult where customers visit multiple store locations, or where repeat visitors to a website would otherwise see different versions of the site. One way to keep the groups separate is to vary actions over time.
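As an illustration (not from the article), the following Python sketch shows one common way to make random assignment persistent: hash a customer ID so that a repeat visitor always lands in the same group. The ID format, experiment name, and 50/50 split are assumptions made for the example.

import hashlib

def assign_group(customer_id, experiment="price-test"):
    # Hash the experiment name plus the customer id; the digest is
    # effectively random but stable, so repeat visits by the same
    # customer always produce the same assignment.
    digest = hashlib.md5(f"{experiment}:{customer_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

# The same customer gets the same group on every visit.
print(assign_group("customer-1042"))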

A feedback mechanism is needed to observe how customers react to different treatments. These mechanisms can be behavioral (e.g., measuring actions such as clicking on ads on a website or making actual purchases) or perceptual (e.g., asking customers how they think they will respond to changes, using surveys, focus groups, or conjoint studies1).
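For concreteness, here is a minimal sketch of the compare-the-results step, assuming behavioral feedback in the form of purchase counts. The counts are invented, and the two-proportion z-test is one standard choice of comparison, not something the article prescribes.

from math import sqrt
from statistics import NormalDist

# Invented counts: purchases out of customers contacted in each group.
treatment_buyers, treatment_n = 220, 5000  # received the new offer
control_buyers, control_n = 180, 5000      # received no offer

p_t = treatment_buyers / treatment_n
p_c = control_buyers / control_n
p_pool = (treatment_buyers + control_buyers) / (treatment_n + control_n)

# Two-proportion z-test for the difference in purchase rates.
se = sqrt(p_pool * (1 - p_pool) * (1 / treatment_n + 1 / control_n))
z = (p_t - p_c) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"treatment {p_t:.1%} vs. control {p_c:.1%}, p-value {p_value:.3f}")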

Seven Rules for Running Experiments

1. Focus on individuals and think short term - Start with experiments that are easy to implement and provide quick, clear insights, e.g., lower a price, or send out a direct mail offer, and then observe how customers react.

2. Keep it simple - Use experiments that are easy to implement with the firm's current resources and staff.

3. Start with a proof-of-concept test - Change as many variables as necessary to learn whether the overall concept works; the contribution of individual variables can be sorted out in follow-up tests.

4. When the results are obtained, slice the data - Don't just look at the aggregate data. Look for subgroups within the treatment and control groups, e.g., men vs. women customers (see the sketch following this list).

5. Try out-of-the-box or what-if thinking - Don't just incrementally adjust current policies, e.g., rather than adjusting prices, try different sales approaches, or cooperative advertising.

6. Measure everything that matters - Feedback should capture all relevant effects, not just the outcome being targeted, e.g., a price cut may increase unit sales while reducing margins.

7. Look for natural experiments - Find treatment and control groups that are created by some outside factor rather than developed specifically for an experiment. Geographic segmentation provides one example. Another arises where different states treat online sales taxes differently, creating naturally separated groups of customers.
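As a sketch of rule 4 (with assumed column names and invented data, not drawn from the article), the following Python fragment shows how slicing by a subgroup can reveal effects that the aggregate comparison hides:

import pandas as pd

# Invented purchase records; 'group', 'gender', and 'purchased'
# are assumed column names for illustration.
results = pd.DataFrame({
    "group":     ["treatment", "treatment", "control", "control"] * 2,
    "gender":    ["M", "F"] * 4,
    "purchased": [1, 0, 0, 0, 1, 1, 0, 1],
})

# The aggregate purchase rate by group...
print(results.groupby("group")["purchased"].mean())

# ...can hide a treatment that works for one segment but not another.
print(results.groupby(["group", "gender"])["purchased"].mean())

In this made-up data, the treatment appears to work overall, but the subgroup slice shows the entire effect comes from the male customers.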

Avoid Obstacles

There are both internal and external obstacles to experimentation. For example, charging different prices to different groups can create an adverse customer reaction. Managers who believe that intuition, or gut instinct, is the best way to make decisions provide another example. The test-and-learn approach requires a willingness to experiment with many different changes. In cases where the number of possible treatments is extremely large, companies can plan and pretest experiments using analytics.

Regardless of the type of experiments conducted, the goal is to shift the organization to a culture of making decisions based on experimentation rather than intuition, i.e., a data-driven test-and-learn approach.

____________________________________________________

Footnotes:

1 Wikipedia defines conjoint analysis as "a survey-based statistical technique used in market research that helps determine how people value different attributes (feature, function, benefits) that make up an individual product or service". For many A/B and multivariate testing tools see MAAW's Experimental Research Tools and Links.

Related summaries:

Appelbaum, D., A. Kogan and M. A. Vasarhelyi. 2017. An introduction to data analysis for auditors and accountants. The CPA Journal (February): 32-37. (Summary).

Appelbaum, D., A. Kogan, M. Vasarhelyi and Z. Yan. 2017. Impact of business analytics and enterprise systems on managerial accounting. International Journal of Accounting Information Systems (25): 29-44. (Summary).

Davenport, T. H. 1998. Putting the enterprise into the enterprise system. Harvard Business Review (July-August): 121-131. (Summary).

Davenport, T. H. 2009. How to design smart business experiments. Harvard Business Review (February): 68-76. (Summary).

Davenport, T. H. and J. Glaser. 2002. Just-in-time delivery comes to knowledge management. Harvard Business Review (July): 107-111. (Summary).

Kohavi, R. and S. Thomke. 2017. The surprising power of online experiments: Getting the most out of A/B and other controlled tests. Harvard Business Review (September/October): 74-82. (Summary).

Spear, S. J. 2004. Learning to lead at Toyota. Harvard Business Review (May): 78-86. (Summary).

Spear, S. and H. K. Bowen. 1999. Decoding the DNA of the Toyota production system. Harvard Business Review (September-October): 97-106. (Summary).

Thomke, S. and J. Manzi. 2014. The discipline of business experimentation. Increase your chances of success with innovation test-drives. Harvard Business Review (December): 70-79. (Summary).

Tschakert, N., J. Kokina, S. Kozlowski and M. Vasarhelyi. 2017. How business schools can integrate data analytics into the accounting curriculum. The CPA Journal (September): 10-12. (Summary).