What counts is decisions, not numbers—Toward an analytics design sheet

G. Ruhe; M. Nayebi    University of Calgary, Calgary, AB, Canada

Abstract

Decision-making is the process of determining a course of action that achieves a goal and fulfills given constraints. The variety of product release decisions (What features to release next? When to release them? Focus on functionality versus quality of features?) is just one example of the wide spectrum of decisions that have to be made during the different stages of the software life-cycle. Some of these decisions are related to operational questions (who should fix this bug?), others are more strategic in nature (outsourcing which parts of development? Migration to another platform?). In all these cases, one wants to select the best feasible alternative based on some evaluation criteria. All these decisions are made by humans. However, the difference is in how much the human decision makers can rely on insight gained from analytics, and how much of it is just based on intuition. We propose the analytics design sheet (ADS) as a guide for selecting the right analytics to support decision-making.

Keywords

Modeling and scoping; Information gathering; Analytics design sheet; Influence diagram

Decisions Everywhere

In the age of app stores, data from users and the market is collected on almost everything: features and their evolution, code and all the changes made, number of downloads, ratings of the apps, reviews written by the users, and the performance of competitors in the market. In the presence of all these data, how can analytics support the development of highly successful apps?

Decision-making is the process of determining a course of action that achieves a goal (explicitly stated or not) and fulfills given constraints. The variety of product release decisions (What features to release next? When to release them? Focus on functionality versus quality of features?) is just one example of the wide spectrum of decisions that have to be made during the different stages of the software life-cycle. Some of these decisions are related to operational questions (who should fix this bug?), others are more strategic in nature (outsourcing which parts of development? Migration to another platform?). In all these cases, one wants to select the best alternative based on some evaluation criteria. The alternatives need to be feasible in terms of hard and soft constraints on cost, time, quality or technical conditions.

In the end, all these decisions are made by humans. However, the difference is in how much the human decision makers can rely on insight gained from analytics, and how much of it is just based on intuition. In the context of release decisions for mobile apps, we found that about half of the app owners make rationale-based decisions for releasing their products [1].

The Decision-Making Process

Russo and Schoemaker [2] recommend that people in business approach decision-making with a clear process and plan. We look at this process from an analytics perspective and define what analytics can provide for the key steps of the decision-making process:

Step 1 Modeling and scoping: A conceptual model is created describing the scope and context of the decision studied. This includes decision variables, independent variables and context factors to be taken into account.

Step 2 Information gathering: Based on the model created, information from different sources is retrieved and pre-processed. The scope of information gathering should include sources both inside and outside your organizational context.

Step 3 Identify and evaluate alternatives: As a form of synthesis, information is explored to determine possible and desirable alternatives. This is a creative process: overlooking an alternative might mean missing the best possible decision, while having too many alternatives makes the final decision more difficult.

Step 4 Select one alternative: Among the alternatives identified in Step 3, select the highest-ranked one that fulfills the applicable hard and soft constraints. This selection is a human-based activity, and all analytics done in the previous steps should be in support of this selection (a minimal code sketch of Steps 3 and 4 follows the list).

Step 5 Implement: The selected alternative is implemented as the solution to the original problem.

Step 6 Monitor and adjust to change: The need to adjust to changes is the result of the inherent uncertainties in the actual decision context. Effort, cost, value or market conditions might change, which implies that a partially implemented solution needs to be adjusted to better match the new situation.
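
The steps above are deliberately technology-neutral. As a minimal illustration of Steps 3 and 4, the following Python sketch filters hypothetical release alternatives against a hard cost constraint and ranks the feasible ones by a weighted score over assumed evaluation criteria; all names, numbers, and weights are invented for illustration only.

```python
# A minimal sketch of Steps 3 and 4: filter the alternatives against a hard
# constraint, then rank the feasible ones by a weighted score over the
# evaluation criteria. All names, numbers, and weights are hypothetical.

alternatives = [
    {"name": "Release feature set A now", "value": 0.8, "risk": 0.6, "cost": 40},
    {"name": "Delay release by one sprint", "value": 0.6, "risk": 0.2, "cost": 25},
    {"name": "Release a minimal feature set", "value": 0.4, "risk": 0.1, "cost": 15},
]

weights = {"value": 0.5, "risk": -0.3, "cost": -0.2}  # soft evaluation criteria
budget = 30                                           # hard constraint on cost

def score(alt):
    """Weighted sum over the criteria; cost is normalized to a 0..1 range."""
    return (weights["value"] * alt["value"]
            + weights["risk"] * alt["risk"]
            + weights["cost"] * alt["cost"] / 100)

# Step 3: keep only the alternatives that satisfy the hard constraint.
feasible = [alt for alt in alternatives if alt["cost"] <= budget]

# Step 4: rank the feasible alternatives; the final choice stays with the human.
for alt in sorted(feasible, key=score, reverse=True):
    print(f"{alt['name']}: score = {score(alt):.2f}")
```

In practice, the criteria, weights, and constraints come out of Steps 1 and 2, and the printed ranking only informs the human selection made in Step 4.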

While Steps 1 and 5 are not directly related to analytics, the other steps mostly benefit from it. In what follows, we describe how to further qualify this process.

The Analytics Design Sheet

One of the key mistakes in data mining is “Running algorithms repeatedly and blindly” [3]. We propose the analytics design sheet (ADS) as a guide for selecting the right analytics to support decision-making. The ADS consists of four quadrants, Q1 to Q4, devoted to the decision problem specified in the heading of the sheet:

• Context: Description of problem context factors and problem formulation (Q1).

• Decision: High-level specification of the decision to be made (Q2).

• Data: Availability of data (Q3).

• Analytics: Selection of helpful analytics techniques (Q4).

Q1 of the ADS addresses problem scoping and formulation. Choosing the right analytics requires a proper understanding of the context and of the actual real-world problem. Next, the specific decision problem to be tackled needs to be outlined. As such, Q2 represents an informal model of the decision problem under consideration. One way to do this is to use an influence diagram [4], a simple visual representation of a decision problem. No perfection or completeness is expected at this stage; data analytics is an adaptive process with an increasing level of detail.
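
As a minimal illustration of such an informal model, the following sketch represents an influence diagram as a plain graph of decision, chance, and value nodes; the node names are hypothetical and merely mirror the kind of release decision discussed in the example below.

```python
# A minimal sketch of an influence diagram as a plain graph: decision,
# chance, and value nodes connected by arcs. All node names are hypothetical.

nodes = {
    "Include feature?": "decision",        # the decision to be made
    "Trial usage": "chance",               # uncertainty observed before deciding
    "Implementation effort": "chance",     # uncertainty affected by the decision
    "Net benefit": "value",                # the quantity to be maximized
}

arcs = [
    ("Trial usage", "Include feature?"),            # information arc
    ("Include feature?", "Implementation effort"),
    ("Include feature?", "Net benefit"),
    ("Implementation effort", "Net benefit"),
]

for src, dst in arcs:
    print(f"{src} [{nodes[src]}] -> {dst} [{nodes[dst]}]")
```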

The third quadrant, Q3, evaluates key features of the data and their availability. Finally, Q4 provides (human, expert-based opinion on) alternatives for selecting the appropriate analytics. This is not meant to be a prescriptive selection, but rather a brainstorming of potential analytical techniques applicable to the stated problem.
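
For teams that keep their analysis scripts under version control, the sheet itself can also be captured as a simple record. The following sketch is one possible, hypothetical encoding of the heading and the four quadrants; the placeholder content is invented and not a prescription.

```python
# A minimal sketch of the ADS as a data structure: a problem heading plus the
# four quadrants, to be filled during brainstorming. All contents are
# hypothetical placeholders.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnalyticsDesignSheet:
    problem: str                                         # heading of the sheet
    context: str                                         # Q1: context and formulation
    decision: str                                         # Q2: decision to be made
    data: List[str] = field(default_factory=list)        # Q3: available data sets
    analytics: List[str] = field(default_factory=list)   # Q4: candidate techniques

ads = AnalyticsDesignSheet(
    problem="Which features should go into the next release?",
    context="Commercial product with a fixed quarterly release cycle",
    decision="Select a subset of candidate features for the upcoming release",
    data=["feature backlog", "effort records from past releases"],
    analytics=["effort prediction", "what-if analysis"],
)
print(ads)
```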

Example: App Store Release Analysis

We illustrate the idea of the ADS with an example taken from the domain of app store analytics. With the mass of available explicit and implicit user feedback, synergies between goal-oriented analytics and human expertise are needed to make good decisions [5]. A team of game app developers decided to gain more visibility for their app in the Google Play store by adding new incentives to the game. To this end, a prototype of a new feature is implemented and offered via a beta release. Over a period of time, the team monitors the usage of the feature (Q1). The actual decision problem is whether or not to include the new feature, based on the monitored trial usage and the predicted effort for implementing the full functionality of the feature (Q2). For the analysis, there is a real-world data set recording the feature usage (frequency and duration of use), and another (historical) data set describing the actual effort for implementing similar features in the past (Q3). Selecting from the variety of techniques outlined by Bird et al. [6], time-series analysis, predictive modeling and what-if benefit analysis are suggested in Q4. As a result, a recommendation is given on whether or not the new feature should be included, and this suggestion is supported by data analytics. The sample ADS is shown in Fig. 1.

Fig. 1 Sample analytics design sheet in support of the decision to add a new feature.
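
To make the Q4 suggestions concrete, the following sketch combines a simple usage trend, an effort estimate from similar past features, and a toy what-if rule under an assumed effort budget; all numbers and thresholds are hypothetical, and a real analysis would substitute the team's own models and benefit estimates.

```python
# A minimal sketch of the analytics suggested in Q4, on hypothetical data:
# a usage trend from the beta release, an effort estimate from similar past
# features, and a toy what-if rule under an assumed effort budget.
import statistics

# Hypothetical daily counts of beta users exercising the new feature.
daily_usage = [12, 15, 14, 18, 21, 24, 23, 27, 30, 33]

# Hypothetical efforts (person-days) of similar features shipped in the past.
historical_effort = [14, 18, 11, 16, 20]

# Time-series view: average day-over-day growth of the trial usage.
growth = statistics.mean(b - a for a, b in zip(daily_usage, daily_usage[1:]))

# Stand-in for predictive modeling: estimate effort as the mean of similar features.
predicted_effort = statistics.mean(historical_effort)

# What-if benefit analysis: favor inclusion when usage clearly grows and the
# predicted effort stays within the assumed budget.
effort_budget = 20
recommend = growth > 0 and predicted_effort <= effort_budget

print(f"usage growth per day: {growth:.1f}, predicted effort: {predicted_effort:.1f} person-days")
print("recommendation:", "include the feature" if recommend else "do not include yet")
```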

The ADS is a semi-formal approach, leveraging existing knowledge and experience. The sheet is intended to support brainstorming and to facilitate discussion among stakeholders. The selection of specific analytical techniques is outside the scope of the sheet. Overall, the proposed approach is intended to facilitate the transition from numbers to actual decision support.

References

[1] Nayebi M., Adams B., Ruhe G. Mobile app releases—a survey research on developers and users perception. In: Proceedings SANER; 2016.

[2] Russo J.E., Schoemaker P.J. Winning decisions: getting it right the first time. New York: Doubleday; 2002.

[3] Delen D. Real-world data mining. Upper Saddle River: Pearson Education; 2014.

[4] Shachter R.D. Evaluating influence diagrams. Oper Res. 1986;34(6):871–882.

[5] Maalej W., Nayebi M., Johann T., Ruhe G. Toward data-driven requirements engineering. Special issue on the future of software engineering. IEEE Software. 2016;33(1):48–54.

[6] Bird C., Menzies T., Zimmermann T. The art and science of analyzing software data. Burlington: Morgan Kaufmann; 2015.
