Appendix X: Metrics Guide for Knowledge Management Initiatives

The knowledge-centric organization (KCO) model uses three types of measures to monitor the knowledge management (KM) initiative from different perspectives. Outcome metrics concern the overall organization and measure large-scale characteristics such as increased productivity or revenue for the enterprise. Output metrics measure project-level characteristics, such as the effectiveness of lessons-learned information in solving problems. System metrics monitor the usefulness and responsiveness of the supporting technology tools.

  • System measures relate the performance of the supporting information technologies to the KM initiative. They give only an indirect indication of knowledge sharing and reuse, but they can highlight which assets are most popular and expose usability problems that may limit participation. For example, the Virtual Naval Hospital uses measures of the number of successful accesses, pages read, and visitors to monitor the viability of the information provided.
  • Output measures capture direct process outputs for users and give a picture of the extent to which personnel are drawn to and actually using the knowledge system. For example, some companies evaluate “lesson reuse” to ensure that the lessons they maintain are valuable to users.
  • Outcome measures determine the impact of the KM project on the organization and help determine whether the knowledge base and knowledge transfer processes are working to create a more effective organization. Outcome measures are often the hardest to evaluate, particularly because of the intangible nature of knowledge assets. Some of the best examples of outcome measures come from the private sector. For example, energy giant Royal Dutch/Shell Group reports that ideas exchanged in its community of practice for engineers saved the company $200 million in 2000 alone. In one case, communication on the community message board led to approximately $5 million in new revenue when engineering teams in Europe and the Far East helped a crew in Africa solve a problem the crew had been unable to resolve on its own.
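
To make the three tiers concrete, the sketch below catalogs a few illustrative measures by type. The Metric class and the example entries are assumptions for illustration, not part of the KCO model itself.

```python
# A minimal sketch of cataloguing measures by the three KCO tiers.
# The Metric class and example entries are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class MetricType(Enum):
    OUTCOME = "outcome"  # enterprise-level impact (productivity, revenue)
    OUTPUT = "output"    # project-level usage and usefulness
    SYSTEM = "system"    # technology performance and participation

@dataclass
class Metric:
    name: str
    tier: MetricType
    unit: str

metrics = [
    Metric("annual cost savings attributed to KM", MetricType.OUTCOME, "dollars"),
    Metric("lesson reuse rate", MetricType.OUTPUT, "reuses per lesson"),
    Metric("successful website accesses", MetricType.SYSTEM, "accesses per month"),
]

for m in metrics:
    print(f"{m.tier.value:>7}: {m.name} ({m.unit})")
```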

How Should We Collect and Analyze the Measures?

As you identify the measures you will use for your KM initiative, you will also need to identify a process for collecting them. The important element is to structure the information gathering and to probe deeply enough to understand how decisions are made and what information the measures can provide to support those decisions.

For system measures, look for automated data collection systems, such as tools that measure website accesses and “wait times.” System performance logs will also provide valuable system measures.
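
As an illustration of automated collection, the following sketch computes two common system measures, access counts and average wait time, from a request log. The "timestamp,page,response_ms" format and the sample entries are assumptions; a real site would read its own web-analytics or server performance logs.

```python
# A minimal sketch of automated system-measure collection from a request log.
# The "timestamp,page,response_ms" format and the sample entries are
# assumptions for illustration; real sites would read their own server logs.
from statistics import mean

SAMPLE_LOG = """\
2001-06-01T09:00:00,/lessons/index.html,120
2001-06-01T09:00:05,/lessons/42.html,340
2001-06-01T09:01:10,/experts/directory.html,95"""

accesses = 0
wait_times_ms = []
for line in SAMPLE_LOG.splitlines():
    _timestamp, _page, response_ms = line.split(",")
    accesses += 1
    wait_times_ms.append(int(response_ms))

print(f"Total accesses: {accesses}")
print(f"Average wait time: {mean(wait_times_ms):.0f} ms")
```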

For output and outcome measures, you may end up relying on manual counts, estimates, or surveys. Though surveys are considered a source of soft data because they measure perceptions and reactions, they can be quantitative. For example, a survey might ask the user to respond to a statement using a “1 to 5” Likert scale (where 1 means “strongly disagree” and 5 means “strongly agree”). Survey data can also be useful for capturing and summarizing qualitative information such as comments and anecdotes. One consulting firm used contests with prizes to encourage members of communities of practice to contribute anecdotes describing how membership in the community helped them accomplish a measurable objective for the firm (such as saving time or money, or generating new revenue). Surveys can be conducted in person, by telephone, or in written form, and written surveys can be distributed by mail, by e-mail, or via a website. Surveys can serve a dual purpose: they not only collect useful information but also help educate the survey taker by raising his or her awareness of key issues or critical success factors for the initiative.
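
The summary statistics for such a survey are straightforward to compute. The sketch below tallies responses to a single Likert-scale statement; the response values are invented.

```python
# A minimal sketch of summarizing one Likert-scale survey statement
# (1 = strongly disagree, 5 = strongly agree); the responses are invented.
from collections import Counter
from statistics import mean

responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]  # ten respondents, one statement

print(f"Mean agreement: {mean(responses):.2f} / 5")
print(f"Agreeing (4 or 5): {sum(r >= 4 for r in responses) / len(responses):.0%}")
for score, count in sorted(Counter(responses).items()):
    print(f"  {score}: {'#' * count}")
```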

Other techniques that can be useful include the following:

  • Interviews or workshops

    Stakeholders can be interviewed individually or in a facilitated group workshop to draw out opinions and generate group consensus. The best choice depends on the people, the organizational culture, the information needed, and people’s availability. In either case, it is important to structure the sessions proactively: merely asking people what information they would like is unlikely to yield useful results. A facilitator is recommended for any session, to ask “what if” questions that prompt managers to talk about the types of decisions they commonly make and what decision-making information would be useful.

  • Structured program flows

    Tracing the flow of the program capabilities, the uses of these capabilities by direct users, and the benefits to the end user is another way to identify the information desired from performance measures. This flow-tracing technique is particularly useful for programs for which it is difficult to directly identify or calculate measures for the ultimate end-user benefits.

  • Organization documents

    Documents contain useful information regarding an organization’s goals, priorities, measures, problems, and business operations.

  • Meetings involving the performing organization and stakeholders

    Many organizations convene steering committees of representative internal and external stakeholders. Observing the interchange at meetings can reveal the priorities and issues that the stakeholders believe are important.

Once the measures have been collected, they should be analyzed within the framework chosen earlier. This will ensure that the measures are correlated to the objectives of the initiative and aligned with the strategic goals of the organization. In particular, explicitly note whether the measures give a direct or indirect indication of effects so that your team and stakeholders do not misconstrue or have unrealistic expectations of performance.

What Do the Measures Tell Us and How Should We Change?

This is one of the most critical steps in the measurement process as well as in the entire KCO implementation process. The complex and dynamic nature of KM makes it extremely difficult to devise a plan in the preplanning phase that will not later need to change. Use the framework to help elucidate what you can discover about the effectiveness of the KM project and the participation of its stakeholders. Are they using the knowledge? Are people sharing meaningful knowledge openly? Did people participate during the fanfare of the rollout and then stop? Are there any anecdotes showing that people became more efficient or solved a problem faster because of the knowledge?

For each of these questions and your other indicators, ask why the result occurred. Even without a firm answer, the search for one will most likely yield valuable insights and ideas on how to improve your KM project. Collect and prioritize these new ideas, and go back to your original plans and assumptions to see whether they need to change. It is normal for several measures to need modification. This is a good time to assemble your team and build consensus on what should be changed, how to change it, and when to introduce the changes. Also update the measures and framework to make sure they remain tightly coupled to your new KM plans.

Program and Process Management

This section discusses classes of business objectives that share a common need to understand the current and future performance of programs relative to their requirements. These requirements span a range of development objectives and milestone dates, financial constraints, resource needs and usage, alignment with organizational strategic plans, and adherence to legal, environmental, and safety regulations and laws.

Business Applications

The program and process management business area concerns monitoring and guiding business tasks to ensure they achieve development, financial, and resource objectives. In addition, this area includes business development activities, where people need to identify and assess opportunities, determine their customers’ key interests and funding levels, and obtain business intelligence on competitor capabilities and plans. You should read this section if you are applying KM to the following or similar activities:

  • Program management
  • Project control
  • Business process reengineering
  • Quality management
  • Strategic planning
  • Policy and standards definition
  • Integrated product teams
  • Architecture design and review
  • Plan of action and milestones (POAM)
  • Budgeting
  • Business development
  • Business intelligence
  • Enterprise resource planning (ERP)
  • Customer relationship management (CRM)

The primary KM objectives of these types of activities are to

  • Create a consistent understanding across the organization of key issues, such as standardized methods, policies, and goals and objectives
  • Improve business development
  • Increase effectiveness, productivity, and quality
  • Implement best practices
  • Share and reuse lessons learned

Some examples of KM initiatives for program and process management are

  • Experienced program managers have learned how to substantially reduce the time they spend reporting on their programs to different sponsors, each of which has a different format and set of regulations. This knowledge can help junior program managers become more efficient and provide a higher level of service to their customers. A community of practice is established to enable junior and senior program managers to informally interact and share information on their projects and methods. A special component is the mentor’s corner, a series of video interviews in which the experienced managers explain their key insights and methods.
  • Near the end of every fiscal year, key leaders must stop working on their daily projects for 5 days to answer urgent requests from Congress for consolidated status reports. Most of this time is spent finding the right people to explain current and projected data. With a current listing of points of contact for key projects, this serious disruption to operations can be reduced to half a day. Thus, an experts’ directory that is validated and kept up to date is developed.

Performance Measures

KM metrics should be correlated with as many of the factors influencing the results as possible. Since many forces within an organization affect people’s learning, sharing, and efficiency, it is difficult to separate the effects of the KM processes from those of other processes. Thus, the KM measures should be used as a body of evidence to support analysis and decision-making. As much as possible, the KM measures should be related to, or the same as, the existing measures the organization uses to monitor the success of its mission objectives.

Outcome Measures

Examples of possible outcome measures include

  • Measure the change in resource costs (funds, time, personnel) used in a business process over time. To tie this to the KM initiative, gauge the change against when the KM asset was made available and how heavily it was used, and against other business processes that are not part of the KM initiative (see the sketch after this list). Also include surveys of user attitudes and practices. For example, do the groups who regularly use and maintain a lessons-learned database spend less in overhead funds than other groups? Do they say the lessons learned helped them?
  • Measure the success and failure rate of programs linked to the KM assets over time. For example, has the number of programs completed on time and within cost increased? For all groups, or mostly for groups actively engaged in the KM initiative?
  • Determine the number of groups meeting best-practices criteria, and compare how long it took them to achieve this status with the availability and use of the KM system. For example, did any groups entering a new business area reach an expert level much faster than usual by using the collected best practices and associated corporate lessons learned from the beginning of their project?
  • Gauge the “smartness” of the organization: are more customers commenting on the high level of expertise of different groups, and are more industry awards being won? Can individual work groups present the capabilities of their colleagues as well as their own? How did these groups get that information?
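
As a rough illustration of the resource-cost comparison in the first bullet above, the sketch below contrasts quarterly overhead trends for groups that do and do not use the KM assets. The group names and dollar figures are invented.

```python
# A minimal sketch of the resource-cost comparison: quarterly overhead
# trends for groups that do and do not use the KM assets. The group names
# and dollar figures are invented for illustration.
from statistics import mean

overhead_by_group = {
    # group: (uses KM assets?, quarterly overhead in $K)
    "design":    (True,  [410, 395, 370, 350]),
    "logistics": (True,  [280, 270, 265, 255]),
    "test":      (False, [330, 335, 332, 338]),
}

def trend(costs):
    """Average quarter-over-quarter change in cost."""
    return mean(later - earlier for earlier, later in zip(costs, costs[1:]))

km = [trend(c) for uses_km, c in overhead_by_group.values() if uses_km]
non_km = [trend(c) for uses_km, c in overhead_by_group.values() if not uses_km]

print(f"KM groups, avg quarterly change:     {mean(km):+.1f} $K")
print(f"Non-KM groups, avg quarterly change: {mean(non_km):+.1f} $K")
```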

Output Measures

Examples of possible output measures include

  • Conduct a survey to find out how useful people find the KM initiative. How have people used the collected knowledge? Was it valuable? Did it answer their questions and help solve their problems or was it merely another set of information to read and digest? How do they suggest improving the KM system?
  • Find examples of specific mistakes or problems that were avoided or quickly solved because of KM. These are typically uncovered by talking to people and collecting anecdotes. For example, did the lessons-learned database help someone immediately find out how to compute future estimated resource costs according to new regulations?
  • Determine how much new business is connected to the sharing of expertise. For example, did someone win a contract with a new customer because they watched the video interviews of business development experts in the mentor’s corner of the community of practice?
  • Measure the decrease in time required to develop program status reports. For example, do all managers of cross-functional programs have the same information on resource usage and development progress, as well as on all problems encountered, each with its resolution and responsible point of contact?

System Measures

Examples of possible system measures include

  • Measure the statistics from the KM system. For example, how many times has the website been accessed? How many times have lessons learned or best practices files been downloaded?
  • Measure the activity of a community of practice. For example, how many members are in the community and how often do they interact? How long has it been since the last contribution to a shared repository or threaded discussion? What percentage of total members are active contributors?
  • How easy is it for people to find the information they want? Conduct a survey and test the site yourself. Find out how many responses a search typically generates. If this number is too high (greater than approximately 50), people may be giving up on the search and not making use of the knowledge assets (a sketch of this check follows the list). Are the responses what the user wants to see? Check whether the site is easy to navigate, with an organizational structure consistent with the way users work and think about the information. What is the system latency, that is, the wait time between a user requesting something and the system delivering it?
  • Measure how frequently the knowledge assets are updated. Are the best practices outdated and superseded by new versions? Are the points of contact no longer working on the project? Is there a listed update time that has been exceeded? Are a large number of links to experts no longer valid?
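
The search-overload check in the third bullet above can be automated. A minimal sketch, assuming a hypothetical log of per-query result counts:

```python
# A minimal sketch of the search-overload check: what share of searches
# return more hits than the ~50-result rule of thumb? The counts are invented.
from statistics import median

result_counts = [8, 120, 33, 4, 75, 12, 60, 9, 210, 18]
THRESHOLD = 50  # "greater than approximately 50" from the text

overloaded = sum(1 for n in result_counts if n > THRESHOLD)
print(f"Median results per search: {median(result_counts):.0f}")
print(f"Searches over {THRESHOLD} hits: {overloaded / len(result_counts):.0%}")
```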

Program Execution and Operations

This section discusses classes of business objectives that share a common need for efficiently performing work tasks in a timely manner. These tasks commonly require extensive training and experience, are complex, and can be dangerous.

Business Applications

The program execution and operations business area concerns the activities involved in performing a program’s statement of work: designing, building, testing, evaluating, installing, and maintaining systems; controlling real-time operations; and other tasks focused on developing and delivering tangible products and services. This knowledge must be implementable and practical, and typically includes highly detailed procedures, facts, and analyses. Consequently, this business area involves a substantial amount of tacit knowledge, that is, the unspoken knowledge people build through experience, which is not always easy to articulate. For example, a master electrician knows many characteristics of power systems that a novice electrician does not, making the master many times more productive and efficient on complex tasks. This knowledge is commonly transferred through apprenticeship, mentoring, and educational relationships. You should read this section if you are applying KM to the following or similar activities:

  • Maintenance
  • Engineering design
  • Research and development
  • Manufacturing
  • Test and evaluation
  • Logistics
  • Operations management
  • Software development
  • Hardware and software installation
  • Construction
  • Demolition

The primary KM objectives of these types of activities are to

  • Increase effectiveness, productivity, and quality
  • Implement best practices
  • Share and reuse lessons learned
  • Accelerate learning
  • Maintain, share, and leverage expertise
  • Facilitate team collaboration

Some examples of KM initiatives for program execution and operations are

  • An engineering design team includes members from many organizations located around the world. The entire team is able to meet in person only twice a year, at the formal program reviews. To avoid redundant effort and to make full use of the team’s complementary expertise, a distributed, collaborative, web-based work environment is created where all project information is posted and informal online work sessions take place with file sharing, whiteboards, video, and speech. Because this is the team’s official news source and work center, everyone is confident of finding valuable information whenever they enter the environment.
  • A construction organization faces the retirement of many of its senior members within the next few years. A great deal of the organization’s expertise and success depends on knowledge these workers have built over their long careers. A lessons-learned database is created in which the senior experts are asked to describe their key insights about doing their work. The lessons are collected in both text and video formats and posted on the organization’s intranet.

Performance Measures

KM metrics should be correlated with as many of the factors influencing the results as possible. Since many forces within an organization affect people’s learning, sharing, and efficiency, it is difficult to separate the effects of the KM processes from those of other processes. Thus, the KM measures should be used as a body of evidence to support analysis and decision-making. As much as possible, the KM measures should be related to, or the same as, the existing measures the organization uses to monitor the success of its mission objectives.

Outcome Measures

Examples of possible outcome measures include

  • Measure the change in resource costs (funds, time, personnel) used in a program over time. To tie this to the KM initiative, gauge the change against when the KM asset was made available and how heavily it was used, and against other programs that are not part of the KM initiative. Also include surveys of user attitudes and practices. For example, have maintenance costs decreased, and have average readiness rates increased? Do the technicians say that the lessons-learned database and the community of practice help them get answers? How have they applied these lessons in their work? Remember that collecting these experience stories serves the dual purpose of performance measurement and “advertising” the KM initiative.
  • Calculate the total life-cycle cost. Has it decreased more than that of comparable projects not using KM?
  • Assess risks from changes in the business environment or mission objectives. Is the organization aware of its risks, and does it have contingency plans prepared? Do these plans draw on the expertise of the workers as well as management? Have the KM processes and systems helped develop and review these plans?
  • Measure the number of cross-functional teams, both formal and informal. Are the teams working together and sharing? Are the teams ahead of schedule and making fewer mistakes? What do the team members say about their ability and willingness to openly share critical knowledge? Is there knowledge hoarding because of internal competition?

Output Measures

Examples of possible output measures include

  • Conduct a survey to find out how useful people find the KM initiative. How have people used the collected knowledge? Was it valuable? Did it answer their questions and help solve their problems, or was it merely another set of information to read and digest? How do they suggest improving the KM system?
  • Find examples of specific mistakes or problems that were avoided or quickly solved because of KM. These are typically uncovered by talking to people and collecting anecdotes. Was a costly or time-consuming manufacturing problem fixed by using the lessons-learned database? Have experts been contacted from the expertise directory? Were they consulted during a task to answer detailed questions?
  • Measure how quickly and precisely people can find information on the KM system. Do people have to sort through a large volume of information, or are succinct prepackaged synopses available? Is there active and continuous content management that distills and validates critical information into synopses? Was an engineering team able to find, fill out, and submit all required regulatory forms within 10 minutes, 1 hour, 1 day, or 1 week, and was this faster or slower than before the KM system was implemented? (A sketch of this before-and-after comparison follows the list.)
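
The before-and-after timing comparison in the last bullet above might look like the following sketch; the task timings are invented.

```python
# A minimal sketch of the before-and-after comparison for a timed task
# (here, finding and submitting regulatory forms). The timings are invented.
from statistics import median

minutes_before_km = [95, 130, 80, 160, 110]
minutes_after_km = [25, 40, 15, 55, 30]

print(f"Median time before KM: {median(minutes_before_km)} min")
print(f"Median time after KM:  {median(minutes_after_km)} min")
print(f"Median speed-up: {median(minutes_before_km) / median(minutes_after_km):.1f}x")
```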

System Measures

Examples of possible system measures include:

  • Measure the statistics from the KM system. How many times has the website been accessed? How many times have lessons learned or best practices files been downloaded?
  • Measure the activity of a community of practice. How many members are in the community, and how often do they interact? How long has it been since the last contribution to a shared repository or threaded discussion? What percentage of total members are active contributors? (A sketch of these activity measures follows the list.)
  • How easy is it for people to find the information they want? Conduct a survey and test the site yourself. How many responses does a search typically generate? If this number is too high (greater than approximately 50), then people may be giving up on the search and not making use of the knowledge assets. Are the responses what the user wants to see? Is the site easy to navigate, with an organizational structure consistent with the way users work and think about the information? What is the system latency, that is, the wait time between a user requesting something and the system delivering it?
  • Measure how frequently the knowledge assets are updated. Are the best practices outdated and superseded by new versions? Are the points of contact no longer working on the project? Is there a listed update time that has been exceeded? Are a large number of links to experts no longer valid?
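
The community-of-practice activity measures in the second bullet above can be computed from contribution records. In the sketch below, the roster and dates are invented.

```python
# A minimal sketch of community-of-practice activity measures. The roster
# maps each member to the date of their last contribution (None = lurker);
# all names and dates are invented.
from datetime import date

last_contribution = {
    "alice": date(2001, 5, 28),
    "bob": date(2001, 3, 2),
    "carol": None,
    "dave": date(2001, 6, 1),
}
today = date(2001, 6, 4)

dates = [d for d in last_contribution.values() if d is not None]
active = [m for m, d in last_contribution.items()
          if d is not None and (today - d).days <= 30]

print(f"Members: {len(last_contribution)}")
print(f"Days since last contribution: {(today - max(dates)).days}")
print(f"Active contributors (30 days): {len(active) / len(last_contribution):.0%}")
```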

Personnel and Training

This section describes classes of business objectives that share a common focus on helping people coordinate and decide professional and personal issues that affect their income, jobs, careers, retirement, education, and families, as well as other quality-of-life topics.

Business Applications

The personnel and training business area concerns activities for human resources, continuing education, personal life issues, and quality of life. These applications focus on helping people improve the effectiveness or quality of their work life and helping organizations attract and retain talent. These activities share a common need for people to determine what options are available from various programs, how those options affect their personal finances and families, what experiences other people have had with them (good and bad), whom to contact to make arrangements, and what the programs require of them. You should read this section if you are applying KM to the following or similar activities:

  • Human resources
  • Distance or e-learning and continuing education
  • Fringe benefits management
  • Career planning
  • Employee retention
  • Relocation

The primary KM objectives of these types of activities are to

  • Provide retirement, health, and financial services
  • Arrange for moving jobs and families to new locations
  • Plan career growth
  • Enhance learning opportunities
  • Improve quality of life
  • Retain and attract employees

Some examples of KM initiatives for personnel and training are

  • An employee is relocating to a new area. Without an opportunity to visit the new location, the employee’s family has to find a home, change banks, arrange for daycare and school, and notify the utility, telephone, and cable companies in both locations. Logging into the relocation community of practice website, the employee finds links to local information and directories for the new location, along with suggestions from people who live there on the best places to live, local daycare centers, how to enroll children in school, and how to sign up for utilities.
  • Employees are encouraged to take continuing education courses over the Internet, offered by several authorized institutions. They can access their personnel records to see what courses they need for various job positions and promotions. As they take an online course, their progress is automatically noted in their personnel records and sent to their supervisor to be included in their performance reviews.
  • Employees can access their fringe benefit plans through the human resources department’s website. They can change their options during open season and compare the cost and benefits offered by retirement and health plans using the website’s interactive feature comparison application. In addition, a lessons-learned database includes key issues discussed by experts on these plans.

Performance Measures

KM metrics should be correlated with as many of the factors influencing the results as possible. Since many forces within an organization affect people’s learning, sharing, and efficiency, it is difficult to separate the effects of the KM processes from those of other processes. Thus, the KM measures should be used as a body of evidence to support analysis and decision-making. As much as possible, the KM measures should be related to, or the same as, the existing measures the organization uses to monitor the success of its mission objectives.

Outcome Measures

Examples of possible outcome measures include

  • Measure the change in resource costs (funds, time, personnel) used in a business process over time. To tie this to the KM initiative, gauge this against when the KM asset was made available and its usage, and to other business processes that are not part of the KM initiative. Also include surveys of user attitudes and practices. Has the cost of administering human resource programs decreased? Have user surveys shown a higher level of satisfaction?
  • Conduct a survey to find out how satisfied people are with their jobs. Are people happy with their health and retirement plans? Do they feel they have good opportunities to learn new skills and subjects? Are they satisfied with their career advancement opportunities? Have these values changed since the KM initiative started?

  • Measure retention rates and the cost of attracting new people. Are fewer people leaving the organization for other jobs? Are starting salaries stable, or are they and other benefits rising to compete with other organizations?

Output Measures

Examples of possible output measures include

  • Conduct a survey to find out how useful people find the KM initiative. How have people used the collected knowledge? Was it valuable? Did it answer their questions and help solve their problems, or was it merely another set of information to read and digest? How do they suggest improving the KM system?
  • Find examples of specific mistakes or problems that were avoided or quickly solved because of KM. These are typically uncovered by talking to people and collecting anecdotes. Have fewer people needed help properly filing their change orders? Are people able to easily locate new housing and services in their new locations? Can people find others through the KM systems who can help them with local details?
  • Measure the usage of the distance-learning system. Are employees taking only required courses, or courses for career advancement as well? (A sketch of this measure follows the list.)
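
The required-versus-elective usage measure in the last bullet above could be computed as in the following sketch; the course log is invented.

```python
# A minimal sketch of the required-versus-elective usage measure for a
# distance-learning system; the course log is invented.
course_log = [
    ("alice", "required"), ("alice", "elective"),
    ("bob", "required"),
    ("carol", "required"), ("carol", "elective"), ("carol", "elective"),
]

employees = {name for name, _ in course_log}
elective_takers = {name for name, kind in course_log if kind == "elective"}

print(f"Employees taking elective courses: {len(elective_takers)}/{len(employees)}")
```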

System Measures

Examples of possible system measures include:

  • Measure the statistics from the KM system. How many times has the website been accessed?
  • Measure the activity of a community of practice. How many members are in the community, and how often do they interact? How long has it been since the last contribution to a shared repository or threaded discussion? What percentage of total members are active contributors?
  • How easy is it for people to find the information they want? Conduct a survey and test the site yourself. How many responses does a search typically generate? If this number is too high (greater than approximately 50), then people may be giving up on the search and not making use of the knowledge assets. Are the responses what the user wants to see? Is the site easy to navigate, with an organizational structure consistent with the way users work and think about the information? What is the system latency, that is, the wait time between a user requesting something and the system delivering it?
  • Measure how frequently the knowledge assets are updated. Are the best practices out of date and superseded by new versions? Are the points of contact no longer available? Is there a listed update time that has been exceeded? Are a large number of links to experts no longer valid? (A sketch of a freshness check follows the list.)
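
A freshness check like the one in the last bullet above can be automated when each asset records a last-updated date and a promised review interval. The records in this sketch are invented.

```python
# A minimal sketch of a knowledge-asset freshness check, assuming each asset
# records a last-updated date and a promised review interval; the records
# are invented.
from datetime import date, timedelta

assets = [
    ("welding best practice", date(2000, 11, 1), timedelta(days=180)),
    ("benefits FAQ", date(2001, 5, 15), timedelta(days=90)),
]
today = date(2001, 6, 4)

for name, last_updated, review_interval in assets:
    status = "OVERDUE for review" if today - last_updated > review_interval else "current"
    print(f"{name}: last updated {last_updated}, {status}")
```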

Summary of KM Performance Measures

Common Measures (applicable to all KM initiatives)

Outcome
  • Time, money, or personnel time saved as a result of implementing the initiative
  • Percentage of successful programs compared to those before KM implementation

System
  • Latency (response times)
  • Number of downloads
  • Number of site accesses
  • Dwell time per page or section
  • Usability survey
  • Frequency of use
  • Navigation path analysis
  • Number of help desk calls
  • Number of users
  • Percentage of total employees using the system

Output
  • Usefulness surveys in which users evaluate how useful the initiative has been in helping them accomplish their objectives
  • Usage anecdotes in which users describe (in quantitative terms) how the initiative has contributed to business objectives

Key System, Output, and Outcome Measures by KM Initiative

Best practice directory
  • Key system measures: number of downloads; dwell time; usability survey; number of users; total number of contributions; contribution rate over time
  • Key output measures: usefulness survey; anecdotes; user ratings of contribution value
  • Key outcome measures: time, money, or personnel time saved by implementing best practices; number of groups certified in the use of the best practice; rate of change in operating costs

Lessons-learned database
  • Key system measures: number of downloads; dwell time; usability survey; number of users; total number of contributions; contribution rate over time
  • Key output measures: time to solve problems; usefulness survey; anecdotes; user ratings of contribution value
  • Key outcome measures: time, money, or personnel time saved by applying lessons learned from others; rate of change in operating costs

Communities of practice or special interest groups
  • Key system measures: number of contributions; frequency of update; number of members; ratio of the number of members to the number of contributors (conversion rate)
  • Key output measures: number of “apprentices” mentored by colleagues; number of problems solved
  • Key outcome measures: savings or improvement in organizational quality and efficiency; captured organizational memory; attrition rate of community members versus a non-member cohort

Expert or expertise directory
  • Key system measures: number of site accesses; frequency of use; number of contributions; contribution/update rate over time; navigation path analysis; number of help desk calls
  • Key output measures: time to solve problems; number of problems solved; time to find an expert
  • Key outcome measures: savings or improvement in organizational quality and efficiency; time, money, or personnel time saved by leveraging an expert’s knowledge or the expertise knowledge base

Portal
  • Key system measures: searching precision and recall; dwell time; latency; usability survey
  • Key output measures: common awareness within teams; time spent “gathering” information; time spent “analyzing” information
  • Key outcome measures: time, money, or personnel time saved as a result of portal use; reduced training time or learning curve as a result of single access to multiple information sources; customer satisfaction (based on the value of self-service or the improved ability of employees to respond to customer needs)

Lead tracking system
  • Key system measures: number of contributions; frequency of update; number of users; frequency of use; navigation path analysis
  • Key output measures: number of successful leads; number of new customers and the value from these customers; value of new work from existing customers; proposal response times; proposal “win” rates; percentage of business developers who report finding value in the use of the system
  • Key outcome measures: revenue and overhead costs; customer demographics; cost and time to produce proposals; alignment of programs with strategic plans

Collaborative systems
  • Key system measures: latency during the collaborative process; number of users
  • Key output measures: number of patents/trademarks produced; number of articles published plus number of conference presentations per employee; number of projects collaborated on; time lost due to program delays; number of new products developed; value of sales from products created in the last 3-5 years (a measure of innovation); average learning curve per employee; proposal response times; proposal “win” rates
  • Key outcome measures: reduced cost of product development, acquisition, or maintenance; reduction in the number of program delays; faster response to proposals; reduced learning curve for new employees

Yellow pages
  • Key system measures: number of users; frequency of use; latency; searching precision and recall
  • Key output measures: time to find people; time to solve problems
  • Key outcome measures: time, money, or personnel time saved as a result of using the yellow pages; savings or improvement in organizational quality and efficiency

e-learning systems
  • Key system measures: latency; number of users; number of courses taken per user
  • Key output measures: training costs
  • Key outcome measures: savings or improvement in organizational quality and efficiency; improved employee satisfaction; reduced cost of training; reduced learning curve for new employees

Based on the Department of the Navy’s “Metrics Guide for Knowledge Management Initiatives,” published in 2001.
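
Several initiatives in the summary above list searching precision and recall as key system measures. A minimal sketch of the two quantities, using invented document identifiers:

```python
# A minimal sketch of searching precision and recall, using invented
# document identifiers.
retrieved = {"doc1", "doc2", "doc3", "doc4"}  # what the search returned
relevant = {"doc2", "doc4", "doc7"}           # what the user actually needed

precision = len(retrieved & relevant) / len(retrieved)  # useful share of results
recall = len(retrieved & relevant) / len(relevant)      # share of useful items found

print(f"Precision: {precision:.0%}, Recall: {recall:.0%}")
```

High precision with low recall suggests the index is too narrow; high recall with low precision means users must wade through noise, which ties back to the approximately-50-hit rule of thumb used earlier.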
