The purpose of the value measuring methodology (VMM) is to define, capture, and measure value associated with electronic services unaccounted for in traditional return-on-investment (ROI) calculations, to fully account for costs, and to identify and consider risk. Developed in response to the changing definition of value brought on by the advent of the Internet and advanced software technology, VMM incorporates aspects of numerous traditional business analysis theories and methodologies, as well as newer hybrid approaches.
VMM was designed to be used by organizations across the federal government to steer the development of an e-government initiative, to assist decision-makers in choosing among investment alternatives, to provide the information required to manage effectively, and to maximize the benefit of an investment to the government.
VMM is based on public and private sector business and economic analysis theories and best practices. It provides the structure, tools, and techniques for comprehensive quantitative analysis and comparison of value (benefits), cost, and risk at the appropriate level of detail.
This appendix provides a high-level overview of the four steps that form the VMM framework. The terminology used to describe the steps should be familiar to those involved in developing, selecting, justifying, and managing an information technology (IT) investment.
A decision framework provides a structure for defining the objectives of an initiative, analyzing alternatives, and managing and evaluating ongoing performance. Just as an outline defines a paper’s organization before it is written, a decision framework creates an outline for designing, analyzing, and selecting an initiative for investment, and then managing the investment. The framework can be a tool that management uses to communicate its agency, government-wide, or focus-area priorities.
The framework facilitates establishing consistent measures for evaluating current and/or proposed initiatives. Program managers may use the decision framework as a tool to understand and prioritize the needs of customers and the organization’s business goals. In addition, it encourages early consideration of risk and thorough planning practices directly related to effective e-government initiative implementation.
The decision framework should be developed as early as possible in the development of a technology initiative. Employing the framework at the earliest phase of development makes it an effective tool for defining the benefits that an initiative will deliver, the risks that are likely to jeopardize its success, and the anticipated costs that must be secured and managed.
The decision framework is also helpful later in the development process as a tool to validate the direction of an initiative, or to evaluate an initiative that has already been implemented.
The decision framework consists of value (benefits), cost, and risk structures, as shown in Figure A8.1. Each of these three elements must be understood to plan, justify, implement, evaluate, and manage an investment.
The tasks and outputs involved with creating a sound decision framework include
The value structure describes and prioritizes benefits in two layers. The first considers an initiative’s ability to deliver value within each of the five value factors (user value, social value, financial value, operational and foundational value, and strategic value). The second layer delineates the measures to define those values.
By defining the value structure, managers gain a prioritized understanding of the needs of stakeholders. This task also requires the definition of metrics and targets critical to the comparison of alternatives and performance evaluation.
The value factors consist of five separate, but related, perspectives on value. As defined in Figure A8.2, each factor contributes to the full breadth and depth of the value offered by the initiative.
Because the value factors are usually not equal in importance, they must be “weighted” in accordance with their importance to executive management.
Identification, definition, and prioritization of measures of success must be performed within each value factor, as shown in Figure A8.3. Valid results depend on project staff working directly with representatives of user communities to define and array the measures in order of importance. These measures are used to define alternatives, and also serve as a basis for alternatives analysis, comparison, and selection, as well as ongoing performance evaluation.
In some instances, measures may be defined at a higher level to be applied across a related group of initiatives, such as organization-wide or across a focus-area portfolio. These standardized measures then facilitate “apples-to-apples” comparison across multiple initiatives. This provides a standard management “yardstick” against which to judge investments.
Whether a measure has been defined by project staff or at a higher level of management, it must include the identification of a metric, a target, and a normalized scale. The normalized scale provides a method for integrating objective and subjective measures of value into a single decision metric. The scale used is not important; what is important is that the scale remains consistent.
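As a sketch of how a metric, a target, and a normalized scale fit together, the following helper maps any raw measurement onto a common 0–100 scale so objective and subjective measures can be combined. The `normalize` function, the scale bounds, and the response-time example are illustrative assumptions, not prescriptions from the guide:

```python
# Hypothetical sketch: normalizing heterogeneous measures onto a common
# 0-100 scale so objective and subjective values can be aggregated.
def normalize(raw, worst, target):
    """Map a raw metric value onto a 0-100 scale, where `worst` scores 0
    and `target` (or better) scores 100. Names are illustrative only."""
    if target == worst:
        raise ValueError("target and worst must differ")
    score = 100 * (raw - worst) / (target - worst)
    return max(0.0, min(100.0, score))  # clamp to the scale

# Example: a response time of 4 seconds, where 10 s is the worst
# acceptable value and 2 s is the target.
print(normalize(4, worst=10, target=2))  # → 75.0
```

Any consistent scale would serve; what matters, as noted above, is that the same scale is applied across all measures so scores remain comparable.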
The measures within the value factors are prioritized by representatives from the user and stakeholder communities during facilitated group sessions.
The risk associated with an investment in a technology initiative may degrade performance, impede implementation, and/or increase costs. Risk that is not identified cannot be mitigated or managed, causing a project to fail either in the pursuit of funding or, more dramatically, during implementation. The greater the attention paid to mitigating and managing risk, the greater the probability of success.
The risk structure serves a dual purpose. First, the structure provides the starting point for identifying and inventorying potential risk factors that may jeopardize an initiative’s success and ensures that plans for mitigating their impact are developed and incorporated into each viable alternative solution.
Second, the structure provides the information management needs to communicate their organization’s tolerance for risk. Risk tolerance is expressed in terms of cost (what is the maximum acceptable cost “creep” beyond projected cost) and value (what is the maximum tolerable performance slippage).
Risks are identified and documented during working sessions with stakeholders, building on issues raised during preliminary planning sessions. The result is an initial risk inventory.
To map risk tolerance boundaries, selected knowledgeable staff are polled to identify at least five data points that will define the highest acceptable level of risk for cost and value.
A cost structure is a hierarchy of elements created specifically to accomplish the development of a cost estimate, and is also called a cost element structure (CES).
The most significant objective in the development of a cost structure is to ensure a complete, comprehensive cost estimate and to reduce the risk of missing costs or double counting. An accurate and complete cost estimate is critical for an initiative’s success. Incomplete or inaccurate estimates can result in exceeding the implementation budget, requiring justification for additional funding or a reduction in scope. The cost structure developed in this step will be used during Step 2 to estimate the cost for each alternative.
Ideally, a cost structure will be produced early in development, prior to defining alternatives. However, a cost structure can be developed after an alternative has been selected or, in some cases, in the early stage of implementation. Early structuring of costs guides refinement and improvement of the estimate during the progress of planning and implementation.
Documentation of the elements leading to the selection of a particular alternative above all others is the “audit trail” for the decision. The documentation of assumptions, the analysis, the data, the decisions, and the rationale behind them are the foundation for the business case and the record of information required to defend a cost estimate or value analysis.
Early documentation will capture the conceptual solution, desired benefits, and attendant global assumptions (e.g., economic factors such as the discount and inflation rates). The documentation also includes project-specific drivers and assumptions, derived from tailoring the structures.
The basis for the estimate, including assumptions and business rules, should be organized in an easy-to-follow manner that links to all other analysis processes and requirements. This will provide easy access to information supporting the course of action, and will also ease the burden associated with preparing investment justification documents. As an initiative evolves through the life cycle, becoming better defined and more specific, the documentation will also mature in specificity and definition.
An alternatives analysis is an estimation and evaluation of all value, cost, and risk factors (Figure A8.4) leading to the selection of the most effective plan of action to address a specific business issue (e.g., service, policy, regulation, business process, or system). An alternative that must be considered is the “base case.” The base case is the alternative where no change is made to current practices or systems. All other alternatives are compared against the base case, as well as with each other.
An alternatives analysis requires a disciplined process to consider the range of possible actions to achieve the desired benefits. The rigor of the process to develop the information on which to base the alternatives evaluation yields the data required to justify an investment or course of action. It also provides the information required to support the completion of the budget justification documents. The process also produces a baseline of anticipated value, costs, and risks to guide the management and ongoing evaluation of an investment.
An alternatives analysis must consistently assess the value, cost, and risk associated with more than one alternative for a specific initiative. Alternatives must include the base case and accommodate specific parameters of the decision framework. VMM, properly used, is designed to avoid “analysis paralysis.”
The estimation of cost and the projection of value use ranges to define the individual elements of each structure. Those ranges are then subject to an uncertainty analysis (see Note 1). The result is a range of expected values and cost. Next, a sensitivity analysis (see Note 2) identifies the variables that have a significant impact on this expected value and cost. The analyses will increase confidence in the accuracy of the cost and predicted performance estimates (Figure A8.5). However, a risk analysis is critical to determine the degree to which other factors may drive up expected costs or degrade predicted performance.
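The uncertainty analysis described above can be sketched as a Monte Carlo simulation over cost ranges. The cost elements, dollar figures, and choice of a triangular distribution below are illustrative assumptions; the guide does not prescribe a specific tool or distribution:

```python
import random

# Illustrative sketch (not the official VMM tool): a Monte Carlo
# uncertainty analysis over cost elements expressed as
# (low, expected, high) ranges. Element names and figures are hypothetical.
COST_ELEMENTS = {
    "hardware":   (100_000, 120_000, 160_000),
    "software":   (200_000, 250_000, 340_000),
    "operations": (150_000, 180_000, 220_000),
}

def simulate_total_cost(elements, trials=10_000, seed=42):
    """Draw each element from a triangular distribution defined by its
    low/expected/high range and return the simulated total costs."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in elements.values()))
    return totals

totals = sorted(simulate_total_cost(COST_ELEMENTS))
# Report the expected cost and an 80% confidence range.
print(f"mean:            {sum(totals) / len(totals):,.0f}")
print(f"10th percentile: {totals[len(totals) // 10]:,.0f}")
print(f"90th percentile: {totals[9 * len(totals) // 10]:,.0f}")
```

The resulting range, rather than a single point estimate, is what carries forward into the comparison of alternatives.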
An alternatives analysis must be carried out periodically throughout the life cycle of an initiative. The following list provides an overview of how the business value resulting from an alternatives analysis changes, depending on where in the life cycle the analysis is conducted.
The tasks and outputs involved with conducting an alternatives analysis include
The challenge of this task is to identify viable alternatives that have the potential to deliver an optimum mix of both value and cost-efficiency. Decision-makers must be given, at a minimum, two alternatives plus the base case to make an informed investment decision.
The starting point for developing alternatives should be the information in the value structure and preliminary drivers identified in the initial basis of estimate (see Step 1).
Using this information will help to ensure that the alternatives and, ultimately, the solution chosen, accurately reflect a balance of performance, priorities, and business imperatives. Successfully identifying and defining alternatives requires cross-functional collaboration and discussion among the stakeholders.
The base case explores the impact of identified drivers on value and cost if an alternative solution is not implemented. That may mean that current processes and systems are kept in place or that organizations will build a patchwork of incompatible, disparate solutions. There should always be a base case included in the analysis of alternatives.
Comparison of alternatives, justification for funding, creation of a baseline against which ongoing performance may be compared, and development of a foundation for more detailed planning require an accurate estimate of an initiative’s cost and value. The more reliable the estimated value and cost of the alternatives, the greater confidence one can have in the investment decision.
The first activity to pursue when estimating value and cost is the collection of data. Data sources and detail will vary based on an initiative’s stage of development. Organizations should recognize that more detailed information may be available at a later stage in the process and should provide best estimates in the early stages, rather than delaying the process by continuing to search for information that is likely not available.
To capture cost and performance data, and conduct the VMM analyses, a VMM model should be constructed. The model facilitates the normalization and aggregation of cost and value, as well as the performance of uncertainty, sensitivity, and risk analyses.
Analysts populate the model with the dollar amounts for each cost element and projected performance for each measure. These predicted values, or the underlying drivers, will be expressed in ranges (e.g., low, expected, or high). The range between the low and high values will be determined based on the amount of uncertainty associated with the projection.
Initial cost and value estimates are rarely accurate. Uncertainty and sensitivity analyses increase confidence that likely cost and value have been identified for each alternative.
The only risks that can be managed are those that have been identified and assessed. A risk analysis considers the probability and potential negative impact of specific factors on an organization’s ability to realize projected benefits or estimated cost, as shown in Figure A8.6.
Even after diligent and comprehensive risk mitigation during the planning stage, some level of residual risk will remain that may lead to increased costs and decreased performance. A rigorous risk analysis will help an organization better understand the probability that a risk will occur and the level of impact the occurrence of the risk will have on both cost and value. Additionally, risk analysis provides a foundation for building a comprehensive risk-management plan.
Inherent in these activities is the need to document the assumptions and research that compensate for gaps in information or understanding. For each alternative, the initial documentation of the high-level assumptions and risks will be expanded to include a general description of the alternative being analyzed, a comprehensive list of cost and value assumptions, and assumptions regarding the risks associated with a specific alternative. This often expands the initial risk inventory.
As shown in Figure A8.7, the estimation of cost, value, and risk provides important data points for investment decision-making. However, when analyzing an alternative and making an investment decision, it is critical to understand the relationships among them.
A complete and valid cost estimate is critical to determining whether or not a specific alternative should be selected. It also is used to assess how much funding must be requested. Understating cost estimates to gain approval, or not considering all costs, may create doubt as to the veracity of the entire analysis. An inaccurate cost estimate might lead to cost overruns, create the need to request additional funding, or reduce scope.
The total cost estimate is calculated by aggregating expected values for each cost element.
ROI metrics express the relationship between the funds invested in an initiative and the financial benefits the initiative will generate. Simply stated, it expresses the financial “bang for the buck.” Although it is not considered the only measure on which an investment decision should be made, ROI is, and will continue to be, a critical data point for decision-making.
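The two calculations above, aggregating expected values for each cost element into a total cost estimate and relating that cost to financial benefits as an ROI metric, can be sketched as follows. The element names and dollar amounts are hypothetical, not figures from the guide:

```python
# Hedged sketch: summing expected cost-element values into a total cost
# estimate, then computing a simple ROI metric. All amounts are
# illustrative assumptions.
expected_costs = {
    "planning":      250_000,
    "development": 1_200_000,
    "operations":    550_000,
}
financial_benefits = 2_600_000  # projected financial value only

total_cost = sum(expected_costs.values())
roi = (financial_benefits - total_cost) / total_cost

print(f"total cost: ${total_cost:,}")  # → total cost: $2,000,000
print(f"ROI: {roi:.0%}")               # → ROI: 30%
```

Note that only financial benefits enter the ROI numerator; the nonfinancial value factors are captured separately in the value score.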
The value score quantifies the full range of value that will be delivered across the five value factors as defined against the prioritized measures within the decision framework. The interpretation of a value score will vary based on the level from which it is being viewed. At the program level, the value score will be viewed as a representation of how alternatives performed against a specific set of measures. They will be used to make an “apples-to-apples” comparison of the value delivered by multiple alternatives for a single initiative.
For example, the alternative that has a value score of 80 will be preferred over the alternative with a value score of 20, if no other factors are considered. At the organizational or portfolio level, value scores are used as data points in the selection of initiatives to be included in an investment portfolio. Since the objectives and measures associated with each initiative will vary, decision-makers at the senior level use value scores to determine what percentage of identified value an initiative will deliver. For example, an initiative with a value score of 75 is providing 75% of the possible value the initiative has the potential to deliver. In order to understand what exactly is being delivered, the decision-maker will have to look at the measures of the value structure.
Consider the value score as a simple math problem. The scores projected for each of the measures within a value factor should be aggregated according to their established weights. The weighted sum of these scores is a factor’s value score. The sum of the factors’ value scores, aggregated according to their weights, is the total value score.
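That two-level weighted aggregation can be sketched as follows. The factors, measures, weights, and scores are hypothetical; the only structural assumptions are that weights at each level sum to 1.0 and that measure scores use a normalized 0–100 scale:

```python
# Minimal sketch of the value-score arithmetic: weighted measure scores
# roll up to factor scores, and weighted factor scores roll up to the
# total value score. All names and numbers are illustrative.
value_factors = {
    #  factor: (factor weight, {measure: (measure weight, score)})
    "user":      (0.30, {"satisfaction": (0.6, 80), "adoption": (0.4, 60)}),
    "financial": (0.45, {"cost avoidance": (1.0, 70)}),
    "strategic": (0.25, {"mission alignment": (1.0, 90)}),
}

def total_value_score(factors):
    total = 0.0
    for factor_weight, measures in factors.values():
        # Weighted sum of measure scores gives the factor's value score...
        factor_score = sum(w * s for w, s in measures.values())
        # ...and the weighted sum of factor scores gives the total score.
        total += factor_weight * factor_score
    return total

print(round(total_value_score(value_factors), 1))  # → 75.6
```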
After considering the probability and potential impact of risks, risk scores are calculated to represent a percentage of overall performance slippage or cost increase.
Risk scores provide decision-makers with a mechanism to determine the degree to which value and cost will be negatively affected and whether that degree of risk is acceptable based on the risk tolerance boundaries defined by senior staff. If a selected alternative has a high cost and/or high-value risk score, program management is alerted to the need for additional risk mitigation, project definition, or more detailed risk-management planning. Actions to mitigate the risk may include the establishment of a reserve fund, a reduction of scope, or a refinement of the alternative’s definition. Reactions to excessive risk may also include reconsideration of whether it is prudent to invest in the project at all, given the potential risks, the probability of their occurrence, and the actions required to mitigate them.
Tasks 1–4 of this step analyze and estimate the value, cost, and risk associated with an alternative. In isolation, each data point does not provide the depth of information required to ensure sound investment decisions.
Before the advent of VMM, only financial benefits could be compared with investment costs through the development of an ROI metric. When comparing alternatives, the consistency of the decision framework allows the determination of how much value will be received for the funds invested. Additionally, the use of risk scores provides insight into how all cost and value estimates are affected by risk.
By performing straightforward calculations, it is possible to model the relationships among value, cost, and risk.
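One such straightforward calculation applies the risk scores to the projected value and cost and then relates the two. All figures below are hypothetical assumptions used only to show the arithmetic:

```python
# Hypothetical sketch: risk-adjusting projected value and cost using
# risk scores (expected percentage slippage), then relating the two.
value_score = 80.0       # projected value score (0-100 scale)
total_cost = 2_000_000   # expected total cost in dollars
value_risk = 0.10        # risk score: 10% expected performance slippage
cost_risk = 0.15         # risk score: 15% expected cost growth

risk_adjusted_value = value_score * (1 - value_risk)
risk_adjusted_cost = total_cost * (1 + cost_risk)

# Value returned per million dollars invested, after risk adjustment.
value_per_million = risk_adjusted_value / (risk_adjusted_cost / 1_000_000)
print(round(risk_adjusted_value, 1))  # → 72.0
print(round(risk_adjusted_cost))      # → 2300000
print(round(value_per_million, 2))    # → 31.3
```

A ratio of this kind lets decision-makers compare alternatives, or whole initiatives, on a common value-for-money footing after risk is taken into account.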
Regardless of the projected merits of an initiative, its success will depend heavily on the ability of its proponents to generate internal support, to gain buy-in from targeted users, and to foster the development of active leadership supporters (champions). Success or failure may depend as much on the utility and efficacy of an initiative as it does on the ability to communicate its value in a manner that is meaningful to stakeholders with diverse definitions of value. The value of an initiative can be expressed to address the diverse definitions of stakeholder value in funding justification documents and in materials designed to inform and enlist support.
Using VMM, the value of a project is decomposed according to the different value factors. This gives project-level managers the tools to customize their value proposition according to the perspective of their particular audience. Additionally, the structure provides the flexibility to respond accurately and quickly to project changes requiring analysis and justification.
The tasks and outputs associated with Step 4 include
Leveraging the results of VMM analysis can facilitate relations with customers and stakeholders. VMM makes communication to diverse audiences easier by incorporating the perspectives of all potential audience members from the outset of analysis. Since VMM calculates the potential value that an investment could realize for all stakeholders, it provides data pertinent to each of those stakeholder perspectives that can be used to bolster support for the project. It also fosters substantive discussion with customers regarding the priorities and detailed plans of the investment. These stronger relationships not only prove critical to the long-term success of the project, but can also lay the foundation for future improvements and innovation.
Many organizations require comprehensive analysis and justification to support funding requests. IT initiatives may not be funded if they have not proved:
After completion of the VMM analysis, one will have the data required to complete or support completion of budget justification documents.
Once a VMM model is built to assimilate and analyze a set of investment alternatives, it can easily be tailored to support ad hoc requests for information or other reporting requirements. In the current, rapidly changing political and technological environment, there are many instances when project managers need to be able to perform rapid analysis. For example, funding authorities, agency partners, market pricing fluctuations, or portfolio managers might impose modifications on the details (e.g., the weighting factors) of a project investment plan; many of these parties are also likely to request additional investment-related information later in the project life cycle. VMM’s customized decision framework makes such adjustments and reporting feasible under short time constraints.
Lessons learned through the use of VMM can be a powerful tool when used to improve overall organizational decision-making and management processes. For example, in the process of identifying metrics, one might discover that adequate mechanisms are not in place to collect critical performance information. Using this lesson to improve measurement mechanisms would give an organization better capabilities for (a) gauging the project’s success and mission-fulfillment, (b) demonstrating progress to stakeholders and funding authorities, and (c) identifying shortfalls in performance that could be remedied.
Conducting an uncertainty analysis requires the following:
Sensitivity analysis is used to identify the business drivers that have the greatest impact on potential variations of an alternative’s cost and its returned value. Many of the assumptions made at the beginning of a project’s definition phase will be found inaccurate later in the analysis. Therefore, one must consider how sensitive a total cost estimate or value projection is to changes in the data used to produce the result. Insight from this analysis allows stakeholders not only to identify variables that require additional research to reduce uncertainty, but also to justify the cost of that research.
The information required to conduct a sensitivity analysis is derived from the same Monte Carlo simulation used for the uncertainty analysis.
Figure A8.9 is a sample sensitivity chart. Based on this chart, it is clear that “Build 5/6 Schedule Slip” is the most sensitive variable.
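A sensitivity ranking of this kind can be sketched by correlating each driver's Monte Carlo draws with the simulated total. The driver names, ranges, and per-unit cost impacts below are hypothetical; the point is only that the driver with the largest correlation is the one most worth additional research:

```python
import random

# Illustrative sketch: rank cost drivers by how strongly their Monte
# Carlo draws correlate with the simulated total cost. All driver names,
# (low, mode, high) ranges, and per-unit impacts are hypothetical.
DRIVERS = {
    "schedule slip (months)":      (0, 2, 12),
    "labor rate growth (%)":       (1, 3, 5),
    "user adoption shortfall (%)": (0, 5, 10),
}
IMPACT_PER_UNIT = {  # cost impact of one unit of each driver
    "schedule slip (months)":      50_000,
    "labor rate growth (%)":       20_000,
    "user adoption shortfall (%)":  5_000,
}

def sensitivity_ranking(trials=5_000, seed=7):
    rng = random.Random(seed)
    draws = {name: [] for name in DRIVERS}
    totals = []
    for _ in range(trials):
        total = 0.0
        for name, (lo, mode, hi) in DRIVERS.items():
            x = rng.triangular(lo, hi, mode)
            draws[name].append(x)
            total += IMPACT_PER_UNIT[name] * x
        totals.append(total)

    def correlation(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Highest correlation first: the most sensitive driver tops the list.
    return sorted(((correlation(draws[n], totals), n) for n in DRIVERS),
                  reverse=True)

for corr, name in sensitivity_ranking():
    print(f"{name}: {corr:.2f}")
```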
Net Present Value = [PV(Internal Project Cost Savings, Operational) + PV(Mission Cost Savings)] − PV(Initial Investment)
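This calculation can be sketched in code. The cash flows, the 7% discount rate, the five-year horizon, and the assumption that the investment is incurred in year one are all illustrative and do not come from the guide:

```python
# Sketch of the NPV formula above, discounting hypothetical annual cash
# flows to present value at an assumed rate.
def present_value(cash_flows, rate):
    """Discount a list of annual cash flows (year 1, 2, ...) to today."""
    return sum(cf / (1 + rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

discount_rate = 0.07
internal_savings = [100_000] * 5   # operational internal project savings
mission_savings = [150_000] * 5    # mission cost savings
initial_investment = [600_000]     # assumed incurred in year 1

npv = (present_value(internal_savings, discount_rate)
       + present_value(mission_savings, discount_rate)
       - present_value(initial_investment, discount_rate))
print(round(npv))
```

A positive result indicates that the discounted savings exceed the discounted investment under the assumed rate and horizon.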
* This appendix is based on the Value Measuring Methodology: How-To-Guide, published by the U.S. Chief Information Officers Council.