Appendix XV: Benchmarking Tutorial

Without good information it is difficult, if not impossible, to initiate a proper benchmarking effort. The information gathered in this process, called data collection by planners and requirements elicitation by software developers, enables the organization to develop valid measures against which it can measure itself.

Interviewing

The most common method of gathering information is by interviewing people. Interviewing can serve two purposes at the same time. The first is a fact-finding mission to discover what each person’s goals and objectives are with respect to the project; and the second is to begin a communications process that enables one to set realistic expectations for the project.

A wide variety of stakeholders can and should be interviewed. Stakeholders are those who have an interest in seeing the project successfully completed; that is, they have a stake in the project. Stakeholders include employees, management, clients, and benchmarking partners.

Employees

Interviews have some major obstacles to overcome. Interviewees may resist giving information out of fear, they may describe how things should be done rather than how they really do them, or they may have difficulty expressing themselves. The analyst's own mind-set may also act as a filter. Interviewers sometimes have to set aside their technical orientation and do their best to put themselves in the interviewee's position. This requires that the analyst develop a certain amount of empathy.

An interview outline should contain the following information:

  1. Name of interviewee
  2. Name of interviewer
  3. Date and time
  4. Objectives of interview—that is, what areas you are going to explore and what data you are going to collect
  5. General observations
  6. Unresolved issues and topics not covered
  7. Agenda—that is, introduction, questions, summary of major points, closing

Recommended guidelines for handling the employee interview process include:

  1. Determine the process type to be analyzed (tactical, strategic, hybrid).
  2. Make a list of departments involved in the process.
  3. For each department, either request or develop an organization chart that shows the departmental breakdown along with the name, extension, and list of responsibilities of each employee.
  4. Meet with the department head to request recommendations and then formulate a plan that details which employees are the best interview prospects. The “best” employees to interview are those (a) who are very experienced (i.e., senior) in performing their job function; (b) who may have come from a competing company and, thus, have a unique perspective; (c) who have had a variety of positions within the department or company.
  5. Plan to meet with employees from all units of the department. In some cases, you may find that interviewing several employees at a time is more effective than dealing with a single employee, as interviewing a group of employees permits them to bounce ideas off each other.
  6. If there are many employees within a departmental unit, it is not practical to interview every one of them. It would be wrong to assume that the more people there are in a department, the higher the number of interviewees should be. Instead, sampling should be used. Sampling is used to (a) contain costs; (b) improve effectiveness; (c) speed up the data-gathering process; and (d) reduce bias. Systems analysts often use a random sample. However, calculating a sample size based on the population size and your desired confidence interval is more accurate. Rather than provide a formula and instructions on how to calculate a sample size, I direct the reader to the sample-size calculator located at http://www.surveysystem.com/sscalc.htm; a sketch of the underlying calculation appears after this list.
  7. Carefully plan your interview sessions. Prepare your interview questions in advance. Be familiar with any technical vocabulary your interview subjects might use.
  8. No meeting should last longer than an hour. A half hour is optimum. There is a point of diminishing returns with the interview process. Your interviewees are busy and usually easily distracted. Keep in mind that some of your interviewees may be doing this against their will.
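
The sample-size calculation performed by calculators such as the one referenced in item 6 is, in essence, Cochran's formula with a finite-population correction. The following Python sketch is illustrative only; the function name and default values are assumptions made for this example rather than part of any particular tool.

  import math

  def sample_size(population, confidence=0.95, margin_of_error=0.05, p=0.5):
      # Cochran's formula with a finite-population correction.
      # p = 0.5 is the most conservative assumption about response variability.
      z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}   # z-values for common confidence levels
      z = z_scores[confidence]
      n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)   # sample size for a very large population
      n = n0 / (1 + (n0 - 1) / population)                   # correct for the finite department size
      return math.ceil(n)

  # Example: a 200-person department, 95% confidence, plus/minus 10% margin of error
  print(sample_size(200, confidence=0.95, margin_of_error=0.10))   # 66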

Customers

Customers often have experiences with other vendors or suppliers and can offer insight into the processes that other companies use or that they have experienced.

Guidelines for interviewing customers include:

  1. Work with the sales and/or marketing departments to select knowledgeable and cooperative customers.
  2. Prepare an adequate sample size as discussed in the prior section.
  3. Carefully plan your interview sessions. Prepare your interview questions in advance.

Companies and Consultants

Another source of potentially valuable information is other companies in the industry and consultants who specialize in the process areas being examined. While consultants can easily be located and paid for their expert advice, it is wise to tread carefully when working with other companies that are current or potential competitors.

Guidelines for interviewing other companies include:

  1. Work with senior management and marketing to create a list of potential companies to interview. This list should contain the names of trading partners, vendors (companies that your company buys from), and competitors.
  2. Attend industry trade shows to meet and mingle with competitors' employees and to listen to speeches given by competing companies.
  3. Attend trade association meetings; sit on policy and standards committees.

Suppliers

Suppliers of the products you are considering are also an important source of ideas. These suppliers know a great deal about how their products are being used in the processes you are examining.

Types of Questions

When interviewing anyone, it is important to know how to ask questions properly. Open-ended questions are best for gaining the most information because they do not limit individuals to predefined answers. Other benefits of open-ended questions are that they put the interviewee at ease, provide more detail, induce spontaneity, and make the interview far more interesting for the interviewee. Open-ended questions require more than a yes or no answer. An example of an open-ended question is "What types of problems do you see on a daily basis with the current process?" Such questions allow individuals to elaborate on the topic and potentially uncover hidden problems that might not be discoverable with a question that requires a yes or no answer.

One disadvantage of open-ended questions is that they create lengthier interviews. Another is that it is easy for the interview to get off track, and it takes a skilled interviewer to keep the session moving efficiently.

Closed-ended questions are, by far, the most common questions in interviewing. They are questions that have yes or no answers and are used to elicit definitive responses.

Past-performance questions can be useful for determining past experiences with similar problems and issues. An example of a past-performance question is, "In your past job, how did you deal with these processes?"

Reflexive questions are appropriate for closing a conversation or moving it forward to a new topic. Reflexive questions are created by taking a statement of confirmation and adding a phrase such as "don't you?", "couldn't you?", or "wouldn't you?"

Mirror questions are a subtle form of probing and are useful in obtaining additional detail on a subject. After the interviewee makes a statement, pause and repeat his or her statement back with an additional or leading question: “So, when this problem occurs, you simply move on to more pressing issues?”

Often, answers do not give the interviewer enough detail, so the interviewer follows up with additional questions to prod the interviewee to divulge more details on the subject. For example:

  1. Can you give some more details on that?
  2. What did you learn from that experience?

Another, more subtle, prodding technique can be used by merely sitting back and saying nothing. The silence will feel uncomfortable, causing the interviewee to expand on his or her last statement.

Questionnaires/Surveys

If there are large numbers of people to interview, one might start with a questionnaire and then follow up with those individuals who present unusual ideas or issues in their questionnaires. Survey development and implementation are composed of the following tasks, according to Creative Research Systems, makers of a software solution for survey creation (surveysolutions.com):

  1. Establish the goals of the project—what you want to learn
  2. Determine your sample—whom you will interview
  3. Choose interviewing methodology—how you will interview
  4. Create your questionnaire—what you will ask
  5. Pretest the questionnaire, if practical—test the questions
  6. Conduct interviews and enter data—ask the questions
  7. Analyze the data—produce the reports

Similar to interviews, questionnaires may contain closed-ended or open-ended questions, or a hybrid, which is a combination of the two.

Survey creation is quite an art form. Guidelines for the creation of a survey include:

  1. Provide an introduction to the survey. Explain why it is important that participants respond to it. Thank them for their time and effort.
  2. Put all important questions first. It is rare that all questions will be responded to. Those filling out the survey often become tired or bored of the process.
  3. Use plenty of "white space." Use an appropriate font (e.g., Arial), an adequate font size (at least 12 points), and skip lines between questions.
  4. Use nominal scales if you wish to classify things (e.g., What make is your computer? 1 = Dell, 2 = Gateway, 3 = IBM).
  5. Use ordinal scales to imply rank (e.g., How helpful was this class? 3 = not helpful at all, 2 = moderately helpful, 1 = very helpful).
  6. Use interval scales when you want to perform some mathematical calculations on the results (e.g., How helpful was this class?), as illustrated by the sketch following this list:

    Not useful at all   1   2   3   4   5   Very useful
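
To make the distinction between scale types concrete, here is a minimal Python sketch in which the responses are invented purely for illustration. Nominal answers can only be classified and counted, while interval answers support arithmetic such as averaging.

  from collections import Counter
  from statistics import mean

  # Hypothetical raw responses (illustrative only)
  computer_makes = ["Dell", "Gateway", "IBM", "Dell", "Dell"]   # nominal scale (item 4)
  usefulness = [4, 5, 3, 4, 2]                                  # interval scale, 1-5 (item 6)

  # Nominal data is classified and counted...
  print(Counter(computer_makes))   # Counter({'Dell': 3, 'Gateway': 1, 'IBM': 1})

  # ...whereas interval data supports calculations such as a mean score
  print(mean(usefulness))          # 3.6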

Survey questions must be carefully worded. Ask yourself the following questions when reviewing each question:

  1. Will the words be uniformly understood?
    In general, use words that are part of the commonly shared vocabulary of the customers. For example,
    1. (poor) Rate the proficiencies of the personnel.
    2. (better) Personnel are knowledgeable.

  2. Do the questions contain abbreviations or unconventional phrases?
    Avoid these to the extent possible, unless they are understood by everyone and are the common way of referring to something. For example,
    1. (poor) Rate our walk-in desk.
    2. (better) Personnel at our front desk are friendly.

  3. Are the questions too vague?
    Survey items should be clear and unambiguous; if they are not, the outcome is difficult to interpret. Make sure you ask something that can truly be measured. For example,
    1. (poor) This library should change its procedures.
    2. (better) Did you receive the information you needed?

  4. Are the questions too precise?
    Sometimes, the attempt to avoid vagueness results in items being too precise and customers may be unable to answer them. For example,
    1. (poor) Each time I visit the library, the waiting line is long.
    2. (better) Generally, the waiting line in the library is long.

  5. Are the questions biased?
    Biased questions influence the customer to respond in a manner that does not correctly reflect his/her opinion. For example,
    1. (poor) How much do you like our library?
    2. (better) Would you recommend our library to a friend?

  6. Are the questions objectionable?
    Usually, this problem can be overcome by asking the question in a less direct way.
    For example,
    1. (poor) Are you living with someone?
    2. (better) How many people, including yourself, are in your household?

  7. Are the questions double-barreled?
    Two separate questions are sometimes combined into one. The customer is forced to give a single response and this, of course, would be ambiguous. For example,
    1. (poor) The library is attractive and well maintained.
    2. (better) The library is attractive.

  8. Are the answer choices mutually exclusive?
    The answer categories must be mutually exclusive and the respondent should not feel forced to choose more than one. For example,
    1. (poor) Scale range: 1, 2–5, 5–9, 9–13, 13 or over
    2. (better) Scale range: 0, 1–5, 6–10, 11–15, 16 or over

  9. Are the answer choices mutually exhaustive?
    The response categories provided should be exhaustive. They should include all the possible responses that might be expected. For example,
    1. (poor) Scale range: 1–5, 6–10, 11–15, 16–20
    2. (better) Scale range: 0, 1–5, 6–10, 11–15, 16 or over

Tallying the responses provides a "score" that assists in making a decision that requires the use of quantifiable information. When using interval scales, keep in mind that not all questions will carry the same weight. Hence, it is a good idea to use a weighted average during calculation. To do this, assign a "weight" or level of importance to each question. For example, the aforementioned question might be assigned a weight of 5 on a scale of 1 to 5, meaning that it is a very important question. On the other hand, a question such as "Was the training center comfortable?" might carry a weight of only 3. Each question's weighted score is calculated by multiplying its weight by its score (s_new = w * s); the overall weighted average is then the sum of these weighted scores divided by the sum of the weights.
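
A minimal Python sketch of this scoring, assuming two hypothetical questions whose scores and weights are chosen purely for illustration:

  def weighted_average(scores, weights):
      # s_new = w * s for each question, then normalize by the total weight
      weighted = [w * s for w, s in zip(weights, scores)]
      return sum(weighted) / sum(weights)

  # Hypothetical example: "How useful was this class?" (weight 5, average score 4.2)
  # and "Was the training center comfortable?" (weight 3, average score 3.0)
  scores = [4.2, 3.0]
  weights = [5, 3]
  print(weighted_average(scores, weights))   # (5*4.2 + 3*3.0) / (5 + 3) = 3.75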

Several problems can affect questionnaire responses. Leniency occurs when respondents grade too easily rather than objectively. Central tendency occurs when respondents rate everything as average. The halo effect occurs when a respondent carries his or her good or bad impression from one question to the next.

There are several methods that can be used to deploy a survey successfully. The easiest and most accurate is to gather all respondents in a conference room and hand out the survey. For the most part, this is not realistic, so other approaches are more appropriate. E-mail and traditional mail are two methods that work well, although you often have to supply an incentive (e.g., a prize) to get respondents to fill out the survey in a timely manner. Web-based surveys (Internet and intranet) are becoming increasingly popular because they enable the inclusion of demos, audio, and video. For example, a web-based survey on what type of user interface is preferable could have hyperlinks to demos or screen shots of the choices.

Observation

Observation is an important tool that can provide a wealth of information. There are two forms of observation: silent and directed. In silent observation, the analyst merely sits on the sidelines with pen and pad and observes what is happening. If it is suitable, a tape recorder or video recorder can capture what is being observed. However, this is not recommended if the net result will be several hours of random footage.

Silent observation is best used to capture the spontaneous nature of a particular process or procedure. For example,

  1. When customers will be interacting with staff
  2. During group meetings
  3. On the manufacturing floor
  4. In the field

Directed observation provides the analyst with a chance to micro-control a process or procedure so that it is broken down into its observable parts. At one accounting firm, a tax system was being developed. The analysts requested that several senior tax accountants be coupled with a junior staff member. The group was given a problem as well as all the manuals and materials they needed. The junior accountant sat at one end of the table with the pile of manuals and forms while the senior tax accountants sat at the other end. A tough tax problem was posed. The senior tax accountants were directed to think through the process and then direct the junior member to follow through on their directions to solve this problem. The catch was that the senior members could not walk over to the junior person nor touch any of the reference guides. This whole exercise had to be verbal and use just their memories and expertise. The entire process was videotaped. The net result was that the analyst had a complete record of how to perform one of the critical functions of the new system.

Participation

The flip side of observation is participation. Actually becoming a member of the staff, and thereby learning exactly what the staff does so that it might be automated, is an invaluable experience.

Documentation

It is logical to assume that there will be a wide variety of documentation available to the analyst. This includes, but is not limited to the following:

  1. Documentation from existing systems. This includes requirements and design specifications, program documentation, user manuals, and help files. This also includes whatever “wish lists” have been developed for the existing system.
  2. Archival information.
  3. Policies and procedures manuals.
  4. Reports.
  5. Memos.
  6. Standards.
  7. E-mail.
  8. Minutes from meetings.
  9. Government and other regulatory guidelines and regulations.
  10. Industry or association manuals, guidelines, standards (e.g., accountants are guided not only by in-house “rules and regulations,” but also by industry and other rules and regulations).

Brainstorming

In a brainstorming session, you gather together a group of people, create a stimulating and focused atmosphere, and let people come up with ideas without risk of being ridiculed. Even seemingly stupid ideas may turn out to be “golden.”

Focus Groups

Focus groups are derived from marketing. These are structured sessions where a group of stakeholders are presented with a solution to a problem and then are closely questioned on their views about that solution.
