The purpose of Measurement and Analysis is to develop and sustain a measurement capability that supports management information needs for the operational resilience management system.
Consistent, timely, and accurate measurements are important feedback for managing any activity. Measurement and Analysis represents a means for applying metrics, measurement, and analysis to the resilience equation. This process area represents the organization’s application of measurement as a foundational activity to provide data and analysis results that can be effectively used to inform and improve the management of the resilience process.
In the Measurement and Analysis process area, the organization establishes the objectives for measurement (i.e., what it intends to accomplish) and determines the measures that would be useful to managing the operational resilience management system as well as to providing meaningful data to higher-level managers for the processes of governance, compliance, monitoring, and improvement. The organization collects relevant data, analyzes this data, and provides reports to managers and other stakeholders to support decision making.
In a generic sense, the measurement and analysis process includes the following activities and objectives:
• specifying the objectives of measurement and analysis such that they are aligned with identified information needs and objectives
• specifying the measures, analysis techniques, and mechanisms for data collection, data storage, reporting, and feedback
• implementing the collection, storage, analysis, and reporting of the data
• providing objective results that can be used in making informed decisions, and taking appropriate corrective actions
Integrating measurement and analysis into the operational resilience management system supports
• planning, estimating, and executing operational resilience management activities
• tracking the performance of operational resilience management activities against established plans and objectives, including resilience requirements
• identifying and resolving issues in operational resilience management processes
• providing a basis for incorporating measurement into additional operational resilience management processes in the future
Measurement and analysis activities are often most effective when focused at the organizational unit or line of business level. Since operational resilience management is an enterprise-wide concern, however, it is important for the enterprise to have mechanisms in place to make use of that local data at the enterprise level, particularly as the enterprise matures. Repositories for measurement data at the organizational unit or line of business level will be useful for local optimization, but as data is shared across organizational units for the overall improvement benefit of the enterprise, measurement repositories may also be needed at the enterprise level.
Measurement and analysis needs are informed by the organization’s governance activities, which are addressed in the Enterprise Focus process area.
Some of the data specified for Measurement and Analysis may be gathered and distributed through processes described in the Monitoring process area.
Measurements may be necessary as evidence of compliance; compliance requirements are addressed in the Compliance process area.
Measurement objectives and activities are aligned with identified information needs and objectives.
Measurement activities should provide needed information to the organization’s resilience management process and program. Failure to design the measurement and analysis activities in consideration of the organization’s needs may lead to inconsistent, incomplete, or inaccurate data collection, inappropriate use or disclosure of measurement data, or inefficient use of measurement resources.
The specific practices covered under this goal may be addressed concurrently or in any order:
• When establishing measurement objectives, experts often think ahead about necessary criteria for specifying measures and analysis procedures. They also think concurrently about the constraints imposed by data collection and storage procedures.
• It often is important to specify the essential analyses that will be conducted before attending to details of measurement specification, data collection, or storage.
Measurement objectives are established and maintained based on information needs and objectives.
Measurement objectives document the purposes for which measurement and analysis are done and specify the kinds of actions that may be taken based on the results of data analyses.
The sources for measurement objectives may be management, technical, asset, or process implementation needs.
The measurement objectives may be constrained by existing processes, available resources, or other measurement considerations. Judgments may have to be made about whether the value of the results will be commensurate with the resources devoted to doing the work.
Modifications to identified information needs and objectives may, in turn, be indicated as a consequence of the process and results of measurement and analysis.
Sources of information needs and objectives may be identified at the organizational unit level or at the enterprise level and may include the following:
• monitoring of operational resilience management system performance
• documented management objectives and resilience strategies
• requirements for protecting and sustaining high-value organizational assets and associated services
• risk conditions currently under management consideration
• process improvement objectives and process performance targets
• contractual, legal, and compliance obligations
• supply chain monitoring, including the resilience program status both upstream and downstream
• industry benchmarks
• interviews with managers and other stakeholders that have special information needs
The development and management of resilience requirements are addressed in the Resilience Requirements Definition and Resilience Requirements Management process areas, respectively. These requirements should be considered in the development of measurement objectives.
(Measurement objectives may overlap with monitoring requirements in that monitoring requirements typically represent information needs that are useful for process control and management. Practice MON:SG1.SP3 focuses on the development of monitoring requirements and should be considered as a source of information for the development of measurement objectives.)
Typical work products
Subpractices
Information needs and objectives are documented to allow traceability to subsequent measurement and analysis activities. (Refer to MON:SG1.SP3 for information about establishing monitoring requirements that may overlap with measurement information needs and goals.)
It may be neither possible nor desirable to subject all initially identified information needs to measurement and analysis. Priorities may also have to be set within the limits of available resources. (Refer to MON:SG1.SP4 for information about the prioritization of monitoring requirements.)
It is important to carefully consider the purposes and intended uses of measurement and analysis.
The measurement objectives are documented, reviewed by managers and other relevant stakeholders, and updated as necessary. Doing so enables traceability to subsequent measurement and analysis activities and helps ensure that the analyses will properly address identified information needs and objectives.
It is important that users of measurement and analysis results be involved in setting measurement objectives and deciding on plans of action. It may also be appropriate to involve those who provide the measurement data. (Refer to MON:SG1.SP2 for information about the establishment of monitoring stakeholders and their inclusion in the monitoring process. These stakeholders may also provide information to, or receive information from, the measurement and analysis process.)
Identified information needs and objectives may have to be refined and clarified as a result of setting measurement objectives. Initial descriptions of information needs may be unclear or ambiguous. Conflicts may arise between existing needs and objectives. Precise targets on an already existing measure may be unrealistic.
There must always be a clear rationale for why a measure is being collected and analyzed.
Of course, the measurement objectives may also change to reflect evolving information needs and objectives.
The measures necessary to meet measurement objectives are established.
Measurement objectives are refined into precise, quantifiable measures.
Measures may be either “base” or “derived.” Data for base measures is obtained by direct measurement. Data for derived measures comes from other data, typically by combining two or more base measures.
Derived measures typically are expressed as ratios, composite indices, or other aggregate summary measures. They are often more quantitatively reliable and meaningfully interpretable than the base measures used to generate them.
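The distinction between base and derived measures can be illustrated with a minimal sketch. The measure names below (incidents opened and closed, restore times) are hypothetical examples for illustration, not measures prescribed by this process area.

```python
# Base measures: obtained by direct measurement, e.g., counted from an
# incident tracking system over one reporting period (illustrative values).
incidents_opened = 40
incidents_closed = 34
restore_hours = [2.5, 4.0, 1.0, 6.5, 3.0]  # per-incident restore times

# Derived measures: computed by combining base measures, typically
# expressed as ratios or aggregate summaries.
closure_rate = incidents_closed / incidents_opened               # ratio
mean_time_to_restore = sum(restore_hours) / len(restore_hours)   # aggregate

print(f"closure rate: {closure_rate:.0%}")
print(f"mean time to restore: {mean_time_to_restore:.1f} h")
```

As the text notes, the derived ratio and average are usually more meaningfully interpretable than the raw counts from which they are computed.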
Typical work products
Subpractices
The measurement objectives are refined into specific measures. The identified candidate measures are categorized and specified by name and unit of measure.
Specifications for measures may already exist, perhaps established for other purposes earlier or elsewhere in the organization.
Operational definitions are stated in precise and unambiguous terms. They address two important criteria:
• Communication—What has been measured, how was it measured, what are the units of measure, and what has been included or excluded?
• Repeatability—Can the measurement be repeated, given the same definition, to get the same results?
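One way to satisfy both criteria is to record each operational definition as a structured specification rather than free text. The sketch below assumes a hypothetical record layout; every field value is illustrative, not a definition mandated by this process area.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MeasureSpecification:
    """An operational definition stated in precise, unambiguous terms."""
    name: str       # what is measured
    unit: str       # unit of measure
    method: str     # how it is measured
    included: str   # what is included
    excluded: str   # what is excluded

# Hypothetical example: mean time to restore service.
mttr = MeasureSpecification(
    name="mean time to restore service",
    unit="hours",
    method="elapsed time from incident declaration to service restoration, "
           "averaged over all incidents closed in the reporting period",
    included="incidents affecting high-value services",
    excluded="scheduled maintenance windows",
)
```

Because every term of the definition is stated explicitly, a second analyst applying the same specification to the same data should obtain the same result, which is the repeatability criterion above.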
Proposed specifications of the measures are reviewed for their appropriateness with potential end users and other relevant stakeholders. Priorities are set or changed, and specifications of the measures are updated as necessary.
The techniques for collecting and storing measurement data are specified.
Explicit specification of collection methods helps ensure that the right data is collected properly. It may also aid in further clarifying information needs and measurement objectives.
Proper attention to storage and retrieval procedures helps ensure that data is available and accessible for future use and that the information is adequately protected and sustained according to applicable resilience requirements.
(Monitoring activities, particularly for the collection, storage, and distribution of data, may overlap significantly with MA:SG1.SP3. Specifically, MON:SG2.SP1, Establish and Maintain Monitoring Infrastructure, MON:SG2.SP3, Collect and Record Information, and MON:SG2.SP4, Distribute Information, may all be useful for achieving MA:SG1.SP3 if the information being monitored for and collected is related directly to measurement and analysis activities.)
Typical work products
Subpractices
Existing sources of data may already have been identified when specifying the measures. Appropriate collection mechanisms may exist whether or not pertinent data has already been collected.
Explicit specifications are made for how, where, and when the data will be collected. Procedures for collecting valid data are specified. The data is stored in an accessible manner for analysis, and it is determined whether it will be saved for possible re-analysis or documentation purposes.
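A procedure for collecting valid data can be made explicit in executable form. The sketch below assumes a hypothetical incident-record layout; the field names and the validity rule are illustrative assumptions, not requirements of this process area.

```python
from datetime import datetime

# Hypothetical required fields for one measurement record.
REQUIRED_FIELDS = {"incident_id", "declared_at", "restored_at"}

def validate_record(record: dict) -> list[str]:
    """Return a list of validity problems; an empty list means the record is accepted."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if not problems:
        declared = datetime.fromisoformat(record["declared_at"])
        restored = datetime.fromisoformat(record["restored_at"])
        if restored < declared:
            problems.append("restored before declared (out-of-bounds)")
    return problems

ok = {"incident_id": "INC-100", "declared_at": "2010-01-05T09:00",
      "restored_at": "2010-01-05T13:30"}
bad = {"incident_id": "INC-101", "declared_at": "2010-01-06T09:00"}

print(validate_record(ok))   # no problems
print(validate_record(bad))  # flags the missing field
```

Applying such checks at collection time, rather than at analysis time, keeps invalid data out of the repository in the first place.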
Practice MON:SG2.SP2 establishes information collection standards and parameters that may be useful for collecting measurement and analysis data. If measurement and analysis data is collected through a monitoring process, the collection specifications should be included in the standards and parameters.
Data collection and storage mechanisms are well integrated with other normal work processes. Data collection mechanisms may include manual or automated forms and templates. Clear, concise guidance on correct procedures is available to those responsible for doing the work. Training is provided as necessary to clarify the processes necessary for collection of complete and accurate data and to minimize the burden on those who must provide and record the data.
Automated support can aid in collecting more complete and accurate data.
However, some data cannot be collected without human intervention (e.g., customer satisfaction or other human judgments), and setting up the necessary infrastructure for other automation may be costly.
Practice MON:SG1.SP2 addresses the essential infrastructure necessary to meet data collection, storage, and distribution standards for monitoring purposes. This infrastructure and the related infrastructure requirements may overlap those of the measurement and analysis process.
Proposed procedures are reviewed for their appropriateness and feasibility with those who are responsible for providing, collecting, and storing the data. They also may have useful insights about how to improve existing processes or may be able to suggest other useful measures or analyses. (See MON:SG2.SP2 for related activities.)
Priorities may have to be reset based on the following:
• the importance of the measures
• the amount of effort required to obtain the data
Considerations include whether new forms, tools, or training would be required to obtain the data.
The techniques for analysis and reporting are specified.
Specifying the analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address the documented measurement objectives (and thereby the information needs and objectives on which they are based). This approach also provides a check that the necessary data will in fact be collected.
For operational resilience management purposes, analysis methods and techniques are likely to be extensive and cover a wide range of disciplines.
Typical work products
Subpractices
Early attention should be paid to the analyses that will be conducted and to the manner in which the results will be reported. These should meet the following criteria:
• The analyses explicitly address the documented measurement objectives.
• Presentation of the results is clearly understandable by the audiences to whom the results are addressed.
Priorities may have to be set within available resources.
All of the proposed content and format are subject to review and revision, including analytic methods and tools, administrative procedures, and priorities. The relevant stakeholders consulted should include intended end users, sponsors, data analysts, and data providers.
Just as measurement needs drive data analysis, clarification of analysis criteria can affect measurement. Specifications for some measures may be refined further based on the specifications established for data analysis procedures. Other measures may prove to be unnecessary, or a need for additional measures may be recognized.
The exercise of specifying how measures will be analyzed and reported may also suggest the need for refining the measurement objectives themselves.
Criteria for evaluating the utility of the analysis might address the extent to which the following apply:
• The results are (1) provided on a timely basis, (2) understandable, and (3) used for decision making.
• The work does not cost more to perform than is justified by the benefits that it provides.
Criteria for evaluating the conduct of the measurement and analysis might include the extent to which the following apply:
• The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.
• There is selection bias in sampling (e.g., only satisfied end users are surveyed to evaluate end-user satisfaction, or only unsuccessful projects are evaluated to determine overall productivity).
• The measurement data is repeatable (e.g., statistically reliable).
• Statistical assumptions have been satisfied (e.g., about the distribution of data or about appropriate measurement scales).
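The first of these criteria, a threshold on missing data, can be checked mechanically. The sketch below assumes missing values are recorded as `None` and uses an illustrative threshold; both are assumptions, since the actual threshold would be set in the analysis specification.

```python
# Illustrative threshold: at most 5% missing values tolerated per data set.
MISSING_DATA_THRESHOLD = 0.05

def exceeds_missing_threshold(values: list,
                              threshold: float = MISSING_DATA_THRESHOLD) -> bool:
    """True if the proportion of missing (None) values exceeds the threshold."""
    missing = sum(1 for v in values if v is None)
    return missing / len(values) > threshold

# 2 of 10 values missing -> 20%, well above the 5% threshold.
sample = [3.1, 2.8, None, 4.0, 3.5, 3.3, None, 2.9, 3.0, 3.2]
print(exceeds_missing_threshold(sample))
```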
Measurement results, which address identified information needs and objectives, are provided.
The primary reason for performing measurement and analysis is to address identified information needs and objectives. Measurement results based on objective evidence can help to monitor performance, achieve resilience plan obligations, fulfill compliance obligations, make informed management and technical decisions, and enable corrective actions to be taken.
Measurement data is collected consistent with measurement objectives.
The data necessary for analysis is obtained and checked for completeness and integrity.
Practice MON:SG2.SP3 specifically addresses the collection of monitoring data that may also include measurement data for the purposes of measurement and analysis.
Typical work products
Data is collected as necessary for both previously used and newly specified base measures.
Data that was collected earlier may no longer be available for reuse in existing databases, paper records, or formal repositories.
Values are newly calculated for all derived measures.
All measurements are subject to error in specifying or recording data. It is always better to identify such errors and sources of missing data early in the measurement and analysis cycle.
Checks can include scans for missing data, out-of-bounds data values, and unusual patterns and correlation across measures. It is particularly important to do the following:
• Test and correct for inconsistency of categorizations made by human judgment (i.e., to determine how frequently people make differing categorization decisions based on the same information, otherwise known as “inter-coder reliability”).
• Empirically examine the relationships among the measures that are used to calculate additional derived measures. Doing so can ensure that important distinctions are not overlooked and that the derived measures convey their intended meanings (otherwise known as “criterion validity”).
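Inter-coder reliability can be quantified with Cohen's kappa, one common statistic that corrects raw agreement for agreement expected by chance. The sketch below uses hypothetical incident categories; kappa is offered as one applicable technique, not one mandated by this process area.

```python
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(coder_a)
    # Observed agreement: proportion of items both coders categorized alike.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement: chance of matching given each coder's category frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two analysts independently categorizing the same ten incidents
# (hypothetical categories: hardware, software, network).
a = ["hw", "sw", "sw", "hw", "net", "sw", "hw", "net", "sw", "hw"]
b = ["hw", "sw", "hw", "hw", "net", "sw", "hw", "net", "sw", "sw"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

Low kappa values indicate that the operational definitions of the categories need clarification before the categorized data can be trusted.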
Controls over information that are relevant to measurement and analysis are addressed in KIM:SG5.SP3 in the Knowledge and Information Management process area.
Measurement data is analyzed against measurement objectives.
The measurement data is analyzed as planned, additional analyses are conducted as necessary, results are reviewed with relevant stakeholders, and necessary revisions for future analyses are noted.
Typical work products
Subpractices
The results of data analyses are rarely self-evident. Criteria for interpreting the results and drawing conclusions should be stated explicitly.
The results of planned analyses may suggest (or require) additional, unanticipated analyses. In addition, they may identify needs to refine existing measures, to calculate additional derived measures, or even to collect data for additional base measures to properly complete the planned analysis. Similarly, preparing the initial results for presentation may identify the need for additional, unanticipated analyses.
It may be appropriate to review initial interpretations of the results and the way in which they are presented before distributing and communicating them more widely.
Reviewing the initial results before their release may prevent needless misunderstandings and lead to improvements in the data analysis and presentation.
Relevant stakeholders with whom reviews may be conducted include asset owners and custodians, resilience staff, vital management personnel, and data providers.
Valuable lessons that can improve future efforts are often learned from conducting data analyses and preparing results. Similarly, ways to improve measurement specifications and data collection procedures may become apparent, as may ideas for refining identified information needs and objectives.
Measurement data, analyses, and results are stored.
Storing measurement-related information enables the timely and cost-effective future use of historical data and results. The information also is needed to provide sufficient context for interpretation of the data, measurement criteria, and analysis results.
Information stored typically includes the following:
• measurement plans
• specifications of measures
• sets of data that have been collected
• analysis reports and presentations
The stored information contains or references the information needed to understand and interpret the measures and to assess them for reasonableness and applicability (e.g., measurement specifications used in different business units when comparing across business units).
Data sets for derived measures typically can be recalculated and need not be stored. However, it may be appropriate to store summaries based on derived measures (e.g., charts, tables of results, or report prose).
Interim analysis results need not be stored separately if they can be efficiently reconstructed.
The organization should determine whether to store data in a centralized manner at the enterprise level, in a decentralized manner at the organizational unit level, or some combination.
Measurement and analysis data may constitute an organizational asset that requires controls to ensure confidentiality, integrity, and availability. (Controls over information assets are addressed in the Knowledge and Information Management process area.)
Typical work products
Subpractices
Ways to prevent inappropriate use of the data and related information include controlling access to data and educating people on the appropriate use of information.
Specific controls over the confidentiality, integrity, and availability of measurement information are specified in goal KIM:SG4 in the Knowledge and Information Management process area.
The results of measurement and analysis activities are communicated to relevant stakeholders.
The results of the measurement and analysis process are communicated to relevant stakeholders in a timely and usable fashion to support decision making and assist in taking corrective action.
Relevant stakeholders include risk managers and higher-level managers, relevant resilience staff, asset owners and custodians, data analysts, and data providers.
Subpractices
Measurement results are communicated in time to be used for their intended purposes. Reports are unlikely to be used if they are simply distributed, without follow-up with those who need to know the results.
To the extent possible and as part of the normal way they do business, users of measurement results are kept personally involved in setting objectives and deciding on plans of action for measurement and analysis. The users are regularly kept apprised of progress and interim results.
Results are reported in a clear and concise manner appropriate to the methodological sophistication of the relevant stakeholders. They are understandable, easily interpretable, and clearly tied to identified information needs and objectives.
The data is often not self-evident to practitioners who are not measurement experts. Measurement reports should therefore be explicitly clear about the following:
• how and why the base and derived measures were specified
• how the data was obtained
• how to interpret the results based on the data analysis methods that were used
• how the results address information needs
Refer to the Generic Goals and Practices document in Appendix A for general guidance that applies to all process areas. This section provides elaborations relative to the application of the Generic Goals and Practices to the Measurement and Analysis process area.
The operational resilience management system supports and enables achievement of the specific goals of the Measurement and Analysis process area by transforming identifiable input work products to produce identifiable output work products.
Perform the specific practices of the Measurement and Analysis process area to develop work products and provide services to achieve the specific goals of the process area.
Elaboration:
Specific practices MA:SG1.SP1 through MA:SG2.SP4 are performed to achieve the goals of the measurement and analysis process.
Measurement and analysis is institutionalized as a managed process.
Establish and maintain governance over the planning and performance of the measurement and analysis process.
Refer to the Enterprise Focus process area for more information about providing sponsorship and oversight to the measurement and analysis process.
Subpractices
Elaboration:
Elaboration:
Establish and maintain the plan for performing the measurement and analysis process.
Elaboration:
The plan for the measurement and analysis process should not be confused with measurement plans for collecting, analyzing, storing, and communicating specific measurement data as described in specific goal MA:SG2. The plan for the measurement and analysis process details how the organization will perform measurement and analysis, including the development of specific measurement plans.
Subpractices
Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process.
Subpractices
Staff assigned to the measurement and analysis process must have appropriate knowledge of the related processes being measured and the objectivity to perform measurement and analysis activities without concern for personal detriment and without the expectation of personal benefit.
Refer to the Organizational Training and Awareness process area for information about training staff for resilience roles and responsibilities.
Refer to the Human Resource Management process area for information about acquiring staff to fulfill roles and responsibilities.
Refer to the Financial Resource Management process area for information about budgeting for, funding, and accounting for measurement and analysis.
Elaboration:
Many of these tools, techniques, and methods should be available as applied to other aspects of organizational measurement and analysis. The intent here is to apply these to operational resilience management.
Assign responsibility and authority for performing the measurement and analysis process, developing the work products, and providing the services of the process.
Elaboration:
Specific practice MA:SG1.SP3 calls for determining responsibilities for data collection, storage, retrieval, and security. Specific practice MA:SG1.SP4 calls for identifying those responsible for data analysis and presentation.
Refer to the Human Resource Management process area for more information about establishing resilience as a job responsibility, developing resilience performance goals and objectives, and measuring and assessing performance against these goals and objectives.
Subpractices
Elaboration:
Refer to the External Dependencies Management process area for additional details about managing relationships with external entities.
Train the people performing or supporting the measurement and analysis process as needed.
Refer to the Organizational Training and Awareness process area for more information about training the people performing or supporting the process.
Refer to the Human Resource Management process area for more information about inventorying skill sets, establishing a skill set baseline, identifying required skill sets, and measuring and addressing skill deficiencies.
Subpractices
Elaboration:
Elaboration:
Place designated work products of the measurement and analysis process under appropriate levels of control.
Elaboration:
Identify and involve the relevant stakeholders of the measurement and analysis process as planned.
Several MA-specific practices address the involvement of stakeholders in the measurement and analysis process. For example, MA:SG1.SP1 calls for involving relevant stakeholders in the formulation of measurement objectives, and MA:SG1.SP2 calls for involving them in the prioritization and review of measurement specifications.
Subpractices
Elaboration:
Monitor and control the measurement and analysis process against the plan for performing the process and take appropriate corrective action.
Elaboration:
While this practice is self-referencing, practices in the Measurement and Analysis process area provide more information about measuring and analyzing operational resilience management processes that can also be applied to the measurement and analysis process.
Refer to the Monitoring process area for more information about the collection, organization, and distribution of data that may be useful for monitoring and controlling processes.
Refer to the Enterprise Focus process area for more information about providing process information to managers, identifying issues, and determining appropriate corrective actions.
Subpractices
Elaboration:
Elaboration:
Objectively evaluate adherence of the measurement and analysis process against its process description, standards, and procedures, and address non-compliance.
Elaboration:
Review the activities, status, and results of the measurement and analysis process with higher-level managers and resolve issues.
Refer to the Enterprise Focus process area for more information about providing sponsorship and oversight to the operational resilience management system.
Measurement and analysis is institutionalized as a defined process.
Establish and maintain the description of a defined measurement and analysis process.
Establishing and tailoring process assets, including standard processes, are addressed in the Organizational Process Definition process area.
Establishing process needs and objectives and selecting, improving, and deploying process assets, including standard processes, are addressed in the Organizational Process Focus process area.
Subpractices
Collect measurement and analysis work products and improvement information derived from planning and performing the process to support future use and improvement of the organization’s processes and process assets.
Elaboration:
Establishing the measurement repository and process asset library is addressed in the Organizational Process Definition process area. Updating the measurement repository and process asset library as part of process improvement and deployment is addressed in the Organizational Process Focus process area.
Subpractices