Chapter 5. Measuring and Prioritizing Value

The next aspect of answering the EDGE question “How should we invest?” is developing appropriate Measures of Success (MoS). Although your Lean Value Tree (LVT) statements are outcome-oriented, you have to determine how to measure those outcomes. Without definitive MoS, you’re left with flowery statements for which success or failure is arbitrary. It’s one thing to establish a goal; it’s quite another to identify an appropriate measure both to prioritize that goal against other investment demands and to monitor progress as initiatives move forward. Much as acceptance criteria enrich agile stories and provide critical information for developers to understand the requirements, MoS describe the desired outcome of the work in a way that the people doing the work can test to determine whether they are on track.

Why Measures Matter

Having a clear understanding of the value that the organization expects from a given portfolio of work is critical to maximizing overall value. One common misconception about measures is that they are created after the work is defined, and are used to track activity progress. Measures that are defined and articulated correctly become a powerful way to shape the work that will help achieve the desired outcome without constraining creativity.

There are three primary reasons for using MoS in EDGE:

  • MoS help leadership shape and align the work, without prescribing a specific solution.

  • MoS replace deliverables as the primary description of what is expected from the team doing the work.

  • MoS are used throughout the delivery process to demonstrate progress, prioritize work, and support decision making on incremental funding.

Identifying Measures of Success

Ideally, MoS represent customer value—a measurement of something a customer sees as valuable (Figure 5-1). Outcomes that the organization desires, but a customer does not directly recognize as valuable, are called “benefits.” The differentiation between customer value and internal business benefit is another important step in transitioning to a customer-centric view of your enterprise.

Figure 5-1 MoS should emphasize customer value as much as possible.

Customer Value

Customer-value MoS, such as delivery time (order to receipt) and customer satisfaction, directly measure outcomes that customers see as valuable. You should strive for MoS that emphasize customer value within the available data and measurement constraints.

Business Benefits

Revenue, profit, market share, and time to market are measures of benefits that are desirable to the organization, but not something a customer sees as valuable. Often benefits are useful as “guardrails” to keep the customer-focused team from giving away the store.

Activity Measures

Being on schedule, budget variance, velocity, and defect count are all examples of measures of activity that provide no directional guidance to the delivery team, and have a tenuous relationship with value. Measures of activity should be used within teams only to enable learning and continuous improvement. Measurements of activity should never be used to evaluate goal, bet, or initiative value.

Leading and Lagging Measures

Measures are often characterized as either leading or lagging indicators. In the context of EDGE, this is another good way to help identify useful MoS.

As an example, a MoS at the goal level that is very value-oriented—such as customer satisfaction (a lagging indicator)—might take a while to have an impact, and many different factors might influence it. It provides useful guidance at a goal level because it focuses the investment on the customer and isn’t overly prescriptive of the solution. However, it’s ill suited to steer initiatives because it’s not very sensitive and there can be a significant delay between action and a change in the result.

Conversely, MoS for initiatives are typically leading indicators, as the examples in Figure 5-2 illustrate. These behavior-based measures are a critical source of feedback and guide decision making. They’re more sensitive, which makes them well suited to prioritization and faster decision making at the initiative level.

Figure 5-2 Example goal, bets, initiatives, and measures of success.

There are certainly risks in focusing on leading indicators: You may make erroneous assumptions about the relationship between the leading indicators (inputs) and the lagging indicators (outcomes). Were that to happen, you’d be steering your initiatives with leading indicator MoS that look good, but won’t achieve your desired outcome.

Our recommendation is that you use leading indicator MoS only when you have high confidence in the correlation between those measures and customer value. If you don’t have that confidence, then design some of your early activities to prove your hypothesis, before committing substantial investment. Figure 5-3 depicts how customer value, business benefit, and activity measures align on the LVT.

Figure 5-3 How customer value, business benefit, and activity measures align on the LVT.

Number of Measures

You may need several MoS to describe the desired outcome for a portfolio (goal, bet, or initiative). Having too many MoS, however, can be counterproductive. Using a single MoS can also have undesired consequences.

For example, if you had a solitary MoS, such as customer satisfaction, it might drive satisfaction at the expense of profitability. You can avoid this undesirable outcome by adding the “guardrail” measure of profitability (a benefit), which provides the guidance that solutions should optimize customer satisfaction and profitability. Then, when your delivery team prioritizes alternative solutions, it will favor those that increase both customer satisfaction and profitability over those that impact only one of those measures.

A small set (one to three) of MoS should be crafted so that high-value options are clearly identifiable, and options that reduce value can also be recognized and avoided.

MoS that are highly correlated with each other are not helpful in the same portfolio, since by definition, if one shows positive results, so will the others. Pick the measure that’s most meaningful to your team and that you can capture with the least effort.

Applying MoS to Portfolios

In EDGE, clear MoS are required for every goal, bet, and initiative. They differ in type, but it should be easy to see how MoS align and contribute to the parent as you traverse up an LVT branch, from initiative to goal.

Using MoS to Align and Differentiate Portfolios

As mentioned in Chapter 4, Building a Value-Driven Portfolio, each portfolio in the LVT should be mutually exclusive and collectively exhaustive (MECE).1 MoS help to differentiate the portfolios within the LVT. All bets for a given goal should contribute to the MoS of the goal. At the same time, each bet should have some unique MoS or a unique effect on its parent MoS that distinguishes it from the other bets in that goal.

1. Minto, Barbara. The Pyramid Principle: Logic in Writing and Thinking. 3rd ed. Harlow, UK: Prentice Hall, 2010.

Similarly, MoS for initiatives should help differentiate each one from other initiatives within a bet.

Prioritizing Value

For many organizations, prioritization is extremely difficult—they want to do everything. Deciding how much to invest in existing goals versus new goals requires thought, judgment, and some luck. To reduce the risk of leading the organization down the wrong path, prioritization should be done on a regular cadence and based on customer value.

In traditional portfolio management, prioritization processes tend to focus on estimated return on investment (ROI) and on fully utilizing resources (e.g., money, people). In EDGE, an organization uses the MoS to describe the outcome it wants from each portfolio, and it uses those MoS to rank and prioritize the work in the portfolio. Prioritizing using MoS ensures an organization is working on those items that produce the maximum value.

Prioritization Approaches

Many different methods for prioritization exist. In keeping with most of our advice in EDGE, the right approach for you is the one that works. However, we would suggest the following criteria to evaluate if your approach is serving your organization well:

  • Does your approach result in the highest-value portfolio item on top of your list?

  • Is the effort required to accomplish the work considered so that portfolio items that have the same value contribution are sequenced with lowest effort first?

  • Is there a way to incorporate other factors that affect ROI into the decision?

  • Can you apply your method with the information you have available at the last responsible moment?

  • Can your method be used quickly to appropriately incorporate new ideas into the prioritized list?

  • Does your method produce a rank-order list (no ties)?

Relative Value Scoring

One approach to prioritizing that meets these criteria is relative value scoring. When ranking items, owners or teams don’t attempt to predict the exact magnitude of the impact on MoS, only the relative impact compared to the other items in that portfolio. Owners or teams then use the same relative approach to forecast the investment or effort required to accomplish something. This approach requires a collaborative effort that leverages everyone’s domain knowledge and experience to quickly make prioritization decisions. An added benefit is that it’s easily adjusted as new information becomes available because this method relies on visibility, collaboration, and relative ranking.

Relative value scoring is substantially different from traditional portfolio management approaches, where great effort is expended to make some kind of upfront ROI justification. ROI is based on a series of assumptions that require validation, and shouldn’t be used as the sole basis for prioritization. Doing so will constrain your ability to decide what’s most valuable and to check those decisions as work is delivered and you learn.

Using a coarse measurement scale such as “Low, Medium, High,” T-shirt sizes (S, M, L), or Fibonacci numbers (1, 2, 3, 5, 8, ...), you utilize the wisdom of the portfolio owner team to assign a value impact score for each MoS to each of the portfolio items. Adding up the value scores for each item and sorting the list makes the highest-value items visible, as shown in Figure 5-4. What is important in this approach is to assign scores relative to each other within the portfolio. In other words, when you are done, the items with the highest scores represent the most value relative to the other items in the list. This step gets you halfway there—you now know what is most valuable in the portfolio.

Figure 5-4 Value impact scoring.
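The scoring-and-sorting mechanics are simple enough to sketch in a few lines of code. In this sketch, every initiative name, MoS, and score is invented purely for illustration: each item gets one Fibonacci-scale score per MoS, relative to the other items, and the totals are sorted to surface the highest-value items.

```python
# Relative value scoring: each portfolio item is scored against each MoS
# on a Fibonacci scale (1, 2, 3, 5, 8, ...), relative to the other items
# in the portfolio. All names and scores below are hypothetical.

measures_of_success = ["customer satisfaction", "profitability"]

# One score per MoS, in the same order as measures_of_success.
value_scores = {
    "initiative A": [8, 3],
    "initiative B": [5, 5],
    "initiative C": [2, 5],
}

# Add up the value scores for each item and sort, highest value on top.
ranked = sorted(value_scores.items(),
                key=lambda item: sum(item[1]),
                reverse=True)

for name, scores in ranked:
    print(f"{name}: total value score {sum(scores)}")
```

Because the scores are relative, only their ordering carries meaning; the team’s collective wisdom goes into debating whether an item rates a 5 or an 8 against its neighbors, not into computing anything precise.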

Next, you need to incorporate the effort component into your scheme. By “effort,” we mean the investments you will be required to make to get the value you have estimated. Effort is often expressed as money, but that doesn’t have to be the only component. For example, in some situations, capacity for change can be a significant constraint and it can be useful to incorporate this factor into the effort side of the equation. Other possible Measures of Effort (MoE) are time, risk, and complexity. You can (and should) use whatever MoE help you find the lowest-effort items based on your experience. You can use the same relative scoring method employed in Figure 5-4 to assign an effort score to each item in the portfolio, total the score for each item, and sort the list as in Figure 5-5. Now you know which items will take the most effort to accomplish.

Figure 5-5 Effort impact scoring.

Next, combine the two components, dividing the value score by the effort score for each item. Sorting the resulting data will sequence the portfolio items with the highest-value and lowest-effort items on top, as shown in Figure 5-6. This creates the backlog that the portfolio owner and team will manage.

Figure 5-6 Combining value and effort scoring.
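As a minimal sketch of this combining step (the value and effort totals below are hypothetical), dividing each item’s value score by its effort score and sorting descending produces the ranked backlog:

```python
# Combining value and effort: divide each item's total value score by its
# total effort score, then sort descending so the highest-value,
# lowest-effort items land on top. All totals below are hypothetical.

value_totals = {"initiative A": 11, "initiative B": 10, "initiative C": 7}
effort_totals = {"initiative A": 8, "initiative B": 3, "initiative C": 5}

priority = {name: value_totals[name] / effort_totals[name]
            for name in value_totals}

backlog = sorted(priority, key=priority.get, reverse=True)
# B (10/3 ≈ 3.33) ranks above C (7/5 = 1.40) and A (11/8 ≈ 1.38)
```

Note how the item with the highest raw value total does not necessarily top the backlog once effort is factored in.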

One word of caution here: You want to keep your MoS and MoE scheme as simple as possible. It’s important that you can very quickly assign scores based on the wisdom of the team without significant delays for analysis.

Cost of Delay

A more sophisticated way of prioritizing portfolio items is to use Cost of Delay (CoD).2 Fundamentally, CoD is the value of having the desired work completed earlier. Typically it is expressed as the monetary value of a one-month change in when the work is received. For example, if a new software feature is expected to improve the customer experience in such a way that it would have a positive impact on your customer retention rate (a MoS for the portfolio), you would use the monetary value of one month of that change in customer retention as the CoD for the feature. The assumption is that if you delayed the implementation of that feature for one month, you would not get the benefit of that positive impact on customer retention.

2. See Don Reinertsen’s book for a complete description of CoD. Reinertsen, Donald G. The Principles of Product Development Flow: Second Generation Lean Product Development. Redondo Beach, CA: Celeritas Publishing, 2009.

In this example, you can see that you need a good understanding of the relationship between your feature and its impact on customer retention. Similar understanding is required for every item in your portfolio if CoD is to be useful for prioritization. For mature portfolios with delivery teams that deeply understand their portfolio domain and have the necessary infrastructure to capture and analyze their MoS data, this might be possible. In our experience with some of the world’s largest organizations, however, this level of sophistication is usually far beyond their current capabilities. For a really good description of the effort involved in implementing CoD for portfolio prioritization, see Black Swan Farming’s white paper3 on their experience with Maersk.

3. Arnold, Joshua, and Özlem Yüce. “Experience Report: Maersk Line.” Black Swan Farming [blog], 2013. http://blackswanfarming.com/experience-report-maersk-line/.

For the sake of completeness, let’s finish our example of applying CoD for prioritization. In the example, CoD is used to represent the value of the feature. This replaces the relative value MoS score used in the earlier example. To complete the equation, we bring the measure of effort concept into the picture. This step allows us to identify the most valuable features that require the least effort to deliver. This method is usually referred to as Cost of Delay divided by duration (CD3).4 To finish the prioritization, divide the CoD by the duration to deliver the feature and get a numeric score that can be ranked as shown in Figure 5-6. This method is irresistible to organizations that like the precision of its calculations.

4. Reinertsen, Donald G. The Principles of Product Development Flow: Second Generation Lean Product Development. Redondo Beach, CA: Celeritas Publishing, 2009.
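Assuming you have credible CoD figures, the CD3 arithmetic itself is trivial. In this sketch, the feature names, CoD values, and durations are invented for illustration:

```python
# CD3: Cost of Delay (value lost per month of delay) divided by duration
# (months to deliver). CoD figures and durations below are hypothetical.

features = {
    # name: (CoD in dollars per month, duration in months)
    "feature X": (120_000, 4),
    "feature Y": (60_000, 1),
    "feature Z": (200_000, 10),
}

cd3 = {name: cod / duration for name, (cod, duration) in features.items()}

ranked = sorted(cd3, key=cd3.get, reverse=True)
# Y scores 60,000; X scores 30,000; Z scores 20,000 -> deliver Y first
```

The difficulty lies not in the division but in defending the CoD inputs, which is exactly where the cautions below apply.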

Our experience suggests some additional cautions are warranted. Besides requiring deep understanding of the relationship between your candidate work and your desired outcome, CD3 strictly focuses on duration, as the entire scheme hinges on utilizing constrained resources to deliver maximum value. For a technology organization that is myopically focused on getting maximum efficiency out of the resources with which it has been entrusted, this is a very effective method. However, EDGE suggests that you should focus on actual value realized—and that involves far more than the technology organization delivering its work. Indeed, the entire value stream must operate in concert to deliver that value.

With CD3, the duration component does not provide the opportunity to consider other types of effort, as described earlier. For example, we have seen portfolios where the change management effort required for implementation was a more significant constraint than the duration. Reputational risk is another factor that can drive prioritization decisions more significantly than the time or cost associated with developing the technology. Certainly, there are ways to incorporate these concerns into a CD3-based prioritization scheme. However, we believe this level of sophistication is beyond many teams, and therefore this approach could create a trap that gives the appearance of precision, but has unintended consequences.

Our advice is to start with relative value scoring. Then, when you have a well-functioning process that needs further improvement, consider experimenting with CD3 in some of your highly dynamic portfolios.

Managing the Strategic Backlog

Businesses always have more ideas than they have the capacity to investigate. So, as you construct your LVT, you’ll likely end up with things that are not yet funded and may be backlog candidates. As you deliver on initiatives, you’ll also gather ideas by learning what helps the business and what hurts it. As always, your competition and the market won’t stand still; thus, new ideas will be generated to address new developments as the market changes.

One benefit of having the entire organization aligned and more familiar with the investment profile through the LVT is that many new ideas will emerge—sometimes from unexpected places. This is a good thing, but such growth needs to be managed so you don’t end up overwhelmed. The VRT helps with this by collecting all of the ideas in a pool of potential backlog items. In some organizations, we call this unvetted pool the “inbox.”

The VRT also schedules regular reviews of potential ideas to prioritize them for grooming and possible inclusion in the backlog of bets or initiatives. The intention is to maintain a small, healthy, and well-vetted backlog of ideas that could be taken on for work as capacity frees up.

Prioritization Challenges

For long-lived delivery teams, initiative backlogs5 must include sustaining (break-fix) and tech-debt items in addition to new work. The sources of these items will vary depending on the organization. Common ones are service desks, incident ticket systems, and field support. It is the portfolio owners’ responsibility (with the VRT’s help) to ensure that all sources of potential work are fed into an integrated backlog of work to be considered. There can be no “back door” into the system. All these types of work are prioritized using the same ranking framework as new ideas.

5. For more about integrated backlogs, see Chapter 7, Integrating Strategic and Business as Usual Portfolios.

Work intake is straightforward once the LVT and the MoS have been established and the prioritization mechanism is in place. Since the highest-priority items are always at the top of the backlog, new work can be taken on when capacity is freed up.

Final Thoughts

Prioritizing in this way often runs counter to the culture of large organizations, which typically desire precision in their estimating processes that feed prioritization. Cost/benefit analysis that produces ROI calculations provides a level of comfort to decision makers who are faced with comparing complex and costly investments.

In our experience, most organizations become good at predicting only what they have extensive experience in. They’re able to use the data they have accumulated and the consistency of their process to create accurate statistical predictions.

Unfortunately, when you’re searching for customer value in a highly dynamic and competitive marketplace, you never have that experience. You are pioneering—searching for new sources of value and routes to get there. If you apply estimation methods that rely heavily on data to drive your prioritization process, then that process will likely slow down as you attempt to gather the data you need. Your value stream will be constantly changing as you attempt to optimize your results. Both of these factors will conspire to make your attempts at statistical estimation futile.

As an alternative, we’ve proposed a method of prioritizing that emphasizes speed rather than precision. Speed enables you to get to the learning as quickly as possible, rather than remaining in “analysis paralysis.” The relative value scoring method described in this chapter relies on the collective wisdom of your team to weigh the data you do have, the team’s experience in the domain, and their own value-creating capability in an attempt to discern the relative differences in value and effort within their portfolio. Team members can’t tell you exactly how much value you’ll get from a particular investment, but they can tell you whether you are likely to get the most value from one particular investment versus another.

Applying this approach to prioritization requires the components of EDGE to be in place: long-lived teams that have deep domain knowledge and have had the opportunity to optimize their ways of working to be consistent. The traditional program/project operating model is missing several of the key ingredients that make this approach successful. For example, in a traditional project-based operating model, teams tend to be short-lived, and assigned to work based on availability. Because they change domains frequently, they don’t have the opportunity to learn about the domain they’re working within. They also don’t have the opportunity to optimize their work process because the team members are changing frequently. In EDGE, we advocate that those teams stay focused on a particular domain for a longer period of time.6 This affords team members the opportunity to learn deeply about what creates value in that domain and the opportunity to optimize their way of working and become more consistent.

6. See Chapter 9, Autonomous Teams and Collaborative Decision Making.

As you can see, this approach relies heavily on the people in the system. It requires investing in their learning and developing trust within the culture. For some organizations, that may be a substantial undertaking, but one we believe can produce a substantial ROI.
