CASE STUDY 2 – EXECUTING THE ASSESSMENTS

Introduction to case study

The previous case study focused on the definition of a bespoke framework for a specific business. This section looks at another case study, but this time from the point of view of the assessments themselves. The emphasis here is therefore on the ‘Assessment Set-up’ and ‘Assessment’ processes in UCAM. The source framework used was, this time, a direct implementation of the INCOSE framework. This framework was deliberately chosen as it allows readers to compare and contrast the approach taken in this case study with that of the previous one, where a bespoke framework based on INCOSE was used.

The company for this case study was, in contrast to the previous example, a large multi-national organisation with many thousands of employees. It was decided to run a pilot project where a number of people with different backgrounds would be assessed against the same scope in order to see how useful the frameworks and the processes would be to the business.

The first step was to understand why the company wanted to carry out the work in the first place. The basic requirements were as follows:

  • to understand how competency could be used to help the general business in terms of internal assessments;

  • to see if competency assessments could provide an ‘edge’ over competitors who may not use such an approach;

  • to assess how useful one particular source framework is for the purposes of the main business.

With these in mind, it was decided to carry out the assessments on a small sample of systems engineers using the INCOSE framework. The UCAM processes were then executed.

The framework definition process

The first process that was executed as part of the whole UCAM assessment was the ‘Framework Definition’ process. In this case study, the INCOSE framework was identified by the company as being their source framework. The INCOSE framework model was used and then mapped onto the UCAM meta-model.

This has been covered in detail elsewhere in this book, so no further thought will be given to it at this point.

The framework population process

The framework population process is mainly concerned with defining the applicable competency set and the evidence types that are to be accepted as demonstration for the indicators. The applicable competency set was generated based on the roles that were to be assessed within the organisation. As this was an experimental assessment, it was decided that the definition of the applicable competency set did not need to be 100 per cent correct at this point, as the results of this assessment exercise would be used as an input to the main assessment programme. The applicable competency set was defined as comprising the following competencies:

  • Systems thinking – systems concepts. This competency covers the absolute fundamentals of systems engineering and was therefore seen as being an essential competency for all systems engineers.

  • Systems thinking – super-system capability issues. Due to the nature of the work carried out by the company, it is important that the engineers can see the ‘bigger picture’ of the systems on which they work, therefore this competency was seen as being very important.

  • Holistic life-cycle view – determine and manage stakeholder requirements. As with many systems engineering organisations, a great deal of importance is attached to getting the requirements of a project right; therefore, this was seen as an essential competency for all staff in the organisation.

  • Holistic life-cycle view – systems integration and verification. Continuing from the last point, systems integration forms a large part of the company’s work. Whenever systems are integrated, there is a great need for the various system elements to work together to form the overall system, which highlights this competency as essential.

  • Holistic life-cycle view – validation. This ensures that a system meets the original requirements – it does what it is supposed to do.

  • Holistic life-cycle view – functional analysis. This covers the basics of functional analysis as applied to a system.

  • Holistic life-cycle view – modelling and simulation. This covers modelling and simulation at the broadest level. As a point of interest, remember that in the previous case study, this competency was deemed to be too generic, and a whole set of other competencies was defined.

  • Systems engineering management – life-cycle process definition. A key part of any systems engineering activity is to have a good approach in place. An essential part of any approach is a good process; therefore, all engineers must have at least an appreciation of the key processes and how they fit in the overall life cycles associated with projects, products and programmes.

  • Systems engineering management – planning, monitoring and controlling. This covers basic project management principles. Again, for interest’s sake, this single competency would map onto almost the whole of the APM framework.

These competencies form the applicable competency set, which is used as a basis for defining the competency scopes for each role in the organisation. The competencies will, therefore, form the horizontal axis of the competency scope, with the actual competency levels forming the vertical axis.
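
By way of illustration, the applicable competency set and the level axis can be captured in a few lines of code. The following Python sketch is not part of UCAM; the structure and variable names are illustrative assumptions, with the theme and competency names taken directly from the list above.

    # The horizontal axis of the competency scope: the applicable
    # competency set, grouped by INCOSE theme.
    APPLICABLE_COMPETENCY_SET = {
        "Systems thinking": [
            "Systems concepts",
            "Super-system capability issues",
        ],
        "Holistic life-cycle view": [
            "Determine and manage stakeholder requirements",
            "Systems integration and verification",
            "Validation",
            "Functional analysis",
            "Modelling and simulation",
        ],
        "Systems engineering management": [
            "Life-cycle process definition",
            "Planning, monitoring and controlling",
        ],
    }

    # The vertical axis: the four competency levels.
    LEVELS = {
        1: "Awareness",
        2: "Supervised practitioner",
        3: "Practitioner",
        4: "Expert",
    }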

In terms of the evidence types that would be used at each level, a completely different approach was taken from the one used in the previous case study. In that example, the evidence types were very well defined and the criteria for each indicator were very strict. In this example, the opposite approach was taken, in that the criteria were very loose. This was decided upon because this set of assessments was to be used as an input to a larger assessment programme. With this in mind, it was decided that very few evidence types would be defined and that the assessors would decide, on an assessment-by-assessment basis, what the appropriate criteria would be. In this way it would then be possible to abstract out a common set of criteria to be used at each level, which would then form the definition of a more formal set of evidence types for future assessments.

The evidence types that were given to the assessors as guidance were simple:

  • Level 1 – awareness: tacit knowledge, informal training course;

  • Level 2 – supervised practitioner: formal course, activity;

  • Level 3 – practitioner: educational qualification, lead activity;

  • Level 4 – expert: professional qualification, publication and activity definition.

These evidence types should look familiar by now, as these are just the generic types that were introduced previously as a good starting point for thinking about evidence type definition. The explanations, therefore, for each one are kept to a minimum to avoid unnecessary duplication.
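
Because the guidance is the same for every competency, it can be written down as a single level-indexed look-up. A minimal Python sketch follows; the name EVIDENCE_TYPES is an illustrative assumption rather than a defined UCAM artefact.

    # Evidence-type guidance given to the assessors, one entry per level.
    # In this case study the same loose guidance applied to every
    # competency, so a single table indexed by level is sufficient.
    EVIDENCE_TYPES = {
        1: ["tacit knowledge", "informal training course"],
        2: ["formal course", "activity"],
        3: ["educational qualification", "lead activity"],
        4: ["professional qualification", "publication", "activity definition"],
    }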

The resultant applicable competency set is shown in Figure 5.4.

Figure 5.4 shows the applicable competency set (the themes and competencies taken from the INCOSE framework) along the horizontal axis of the table and the levels up the vertical axis. The evidence types have been entered into each cell on the chart. Note that these evidence types are the same for each level; this would not happen when the definition is carried out more rigorously but, as this is being used as a starting point only, it is adequate.

Figure 5.4 The applicable competency set

The assessment set-up process

The main aim of the assessment set-up process is to define a competency scope for each of the roles that are to be assessed. The roles decided upon were: requirements engineer, development manager, tutor and graduate. Each of these will be discussed in more detail along with its associated scope.

In order to generate the scopes for the assessments, it was important to look at what activities and responsibilities were associated with each role, but there were also some additional criteria. It is relatively easy to generate a scope based solely on the role, but it is also necessary to look at the context in which the assessment is taking place and to ask why the assessment is being carried out and what will happen to the results.

In terms of the rationale behind the assessment, there were several reasons identified as drivers for the assessment:

  • to generate profiles for each member of staff that could then be used as a basis for placing consultants into client organisations;

  • to identify gaps in staff competencies that can be used as a basis for generating a training programme to fill these gaps;

  • to provide the company with a competitive edge when it comes to bidding and tendering for new work. As part of an industry shift towards looking for competent individuals, rather than just well-qualified individuals, some of the company’s clients are starting to ask for competency profiles (the output of the assessment) as well as CVs for staff that are to be included as part of the bid proposal.

With this in mind, the scopes were defined as discussed in the following sections.

The ‘Requirements engineer’ scope

Figure 5.5 shows the scope that was defined for the role of ‘requirements engineer’ in the organisation.

Figure 5.5 The competency scope for the ‘requirements engineer’ role

Figure 5.5 shows the competency scope for the requirements engineer role. The relevant levels for each of the competencies are shown by shading in the relevant cells.

There are some interesting features to this scope when the shape itself is considered. First of all, notice that it is not a ‘flat’ shape, but has highs and lows. The highest level on this scope is ‘Level 3 – Practitioner’, which is typical for most engineers. The areas in which the requirement for level 3 is present are related to the role name. Anyone who is involved in requirements engineering would be expected to have a good appreciation of systems engineering generally (the ‘Systems thinking’ themed competencies) and would also be expected to be at the same level for the requirements-related life-cycle competencies. This includes ‘Determine and manage stakeholder requirements’, the obvious competency, but also two closely related competencies that require this high level: ‘Functional analysis’ and ‘Modelling and simulation’.

Looking at the ‘Systems engineering management’ theme, there is an interesting pattern here also. Both ‘Life-cycle process definition’ and ‘Planning, monitoring and controlling’ are required competencies, but only at ‘Level 1 – Awareness’. This is quite typical, as the scope is asking that the individual understands management (level 1) but is not expecting any relevant experience in this area.
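
For illustration, this scope can be encoded as a simple competency-to-required-level mapping. The levels below are only those discussed in the text; the remaining cells of Figure 5.5 are not reproduced here, and the representation itself is an illustrative assumption rather than part of UCAM.

    # A sketch of the 'requirements engineer' competency scope:
    # competency -> required level (1 = Awareness ... 4 = Expert).
    REQUIREMENTS_ENGINEER_SCOPE = {
        "Systems concepts": 3,
        "Super-system capability issues": 3,
        "Determine and manage stakeholder requirements": 3,
        "Functional analysis": 3,
        "Modelling and simulation": 3,
        "Life-cycle process definition": 1,
        "Planning, monitoring and controlling": 1,
    }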

The ‘Development manager’ scope

Figure 5.6 shows the scope that was defined for the role of ‘development manager’ in the organisation.

Figure 5.6 The competency scope for the ‘development manager’ role

Figure 5.6 shows the competency scope for the development manager role. The relevant levels for each of the competencies are shown by shading the relevant cells. Notice that the same competencies are present as in the requirements engineer scope, but this time the pattern is very different indeed. The ‘Systems thinking’ themed competencies are required at a high level in this scope, even higher than in the requirements engineer scope. The life-cycle-related competencies here, however, are much lower, with four of them required only at level 1. This is important, as it demonstrates that the engineers who may be performing the technical work (in this example, the requirements engineer) have far higher competencies in the relevant areas than the people who are managing them. This is completely normal and, in some ways, aspects of these two scopes may be viewed as complementary. This is further demonstrated by the very high level required for the management-related competencies, for which the more technical role required only level 1.

One of the points made by the organisation during the discussion was that, in the past, there had been problems with managers who had little or no understanding of some of the technical concepts, which made them very inefficient managers. There was a definite requirement, therefore, that any manager must hold ‘Level 1 – Awareness’ in any area in which they are expected to manage. Generally speaking, this is a very good piece of best practice.

The ‘Tutor’ scope

Figure 5.7 shows the scope that was defined for the role of ‘tutor’ in the organisation.

Figure 5.7 shows the competency scope for the tutor role. The relevant levels for each of the competencies are shown by shading the relevant cells. The scope here is asking for a very high level indeed in many areas. The reasoning behind this is that the competencies of the engineering staff in the company (bearing in mind that this is an engineering company) rely almost entirely on the knowledge and skills of the tutors who are responsible for training and mentoring staff.

This particular scope may look as if it is asking for the world, but bear in mind how crucial this role is for the company. Bear in mind also how few people could actually match a scope like this: recruitment for such a role may take a very long time indeed and may require looking for someone with an established reputation in the relevant field.

Figure 5.7 The competency scope for the ‘tutor’ role

The ‘Graduate’ scope

Figure 5.8 shows the scope that was defined for the role of ‘graduate’ in the organisation.

The chart here shows the competency scope for the graduate role. The relevant levels for each of the competencies are shown by shading the relevant cells. The first thing that leaps out with this scope is that there is not much in it at all. In fact, only three competencies have any level defined, and all of these are at ‘Level 1 – Awareness’. The big question is: is it really worth defining such a simple and empty scope? In order to answer this question, first consider the skills with which a graduate may leave university. In most cases, it may be reasonable to expect them to be aware of what a system is, as the term ‘system’ appears in just about every discipline of science, engineering or, for that matter, any other. Also, every graduate will have been involved in some sort of final-year project where they would have been expected to manage their own, or their team’s, time and resources. Therefore, it is not unreasonable to ask for awareness of key management concepts.

Figure 5.8 The competency scope for the ‘graduate’ role

This scope as it stands may look fairly pointless but, as with all these scopes, it is only when they are compared to their associated competency profiles (the output of the assessments) that they really come into their own. In the case of the graduate scope, however, the real value starts to be added once more than one assessment has been performed, as this is when it is possible to see the trend of the competence of the individual or, to put it better, to see the ‘evolution’ of the individual’s competence – more on this later. Now that a number of competency scopes have been defined, it is possible to carry out the actual assessments, based on these scopes.

The ‘Assessment’ process

The fourth and final of the core UCAM processes to be performed was the ‘Assessment’ process, where the actual assessments are carried out. Each of the assessments was concerned with seven competencies that were assessed to an average of level 3 (some were level 2 and some level 4, but the average is used as a general indication). On the basis of this, the time allocated for a single assessment was three hours, which would include the pre-assessment meeting, the assessment and the post-assessment meeting.

The pre-assessment meeting was where the two assessors met and briefly looked over the assessment scope and any information that was presented to them about the client. The roles of primary and secondary assessor were confirmed (in this case study, there was a set of assessors who could play either role) and the room readied for the actual assessment.

The assessment itself consisted of an informal interview where the two assessors, the primary and secondary assessor, asked leading questions that were geared towards exploring the assessee’s knowledge of systems engineering, based on the competencies identified in the assessment scope. There are a few key points that had to be borne in mind when carrying out these assessments:

  • The assessment should be non-intimidating and it should be stressed to the assessee that the assessment is not being carried out in order to catch them out or to expose gaps in their knowledge for any sinister reasons. Many people will view a competency assessment as a ‘witch hunt’, so assessees must be treated with great care. Each assessee must be able to see the value, at least at a high level, of why the assessment is required. To this end, the first five minutes of the interview consisted of a brief introduction in which the primary assessor would provide an overview of the assessment and the process, and answer any initial questions from the assessee.

  • The sessions were deliberately not run simply by reading through the INCOSE competency descriptions and asking a direct question for each level. This is a very unnatural way to establish a connection with a fellow human being, and an important goal is to get the assessee to be open and honest about their competencies and achievements. For example, a leading question may be something like: ‘Could you please tell me what “requirements” mean to you in your current role?’, rather than: ‘Do you know what a requirement is?’, ‘Are you able to identify stakeholders?’, ‘What is a quality requirement?’ and so on. Indeed, people are far more likely to provide information when the assessment is conducted as a general discussion rather than an interrogation.

  • Considering the two assessors, it is essential that the primary assessor has relevant competencies. For example, when using the INCOSE framework, it was decided that it was essential that the primary assessor holds at least a level 3 (practitioner) in all the ‘systems thinking’ competencies, level 3 of ‘life-cycle process definition’ and level 1 (‘awareness’) in all other relevant competencies. Indeed, a full competency scope for assessors has also been defined and this is an essential part of any assessment.

  • Evidence was recorded by both assessors, who then compared their results at the end of the session and agreed on the final results.

An example of how the results were recorded is shown in Table 5.4.

Table 5.4 Example of recording the results of an assessment

Competency reference: ‘System concepts’, level 1

  Indicator                          Evidence type                      Evidence                     Pass/Fail
  Is aware of system life cycle      Informal course, tacit knowledge   Formal course certificate    Pass
  Is aware of hierarchy of systems   Informal course, tacit knowledge   No evidence                  Fail
  Is aware of system context         Informal course, tacit knowledge   No evidence                  Fail
  Is aware of interfaces             Informal course, tacit knowledge   Informal course certificate  Pass

  Rating %: 50%

  Rating scheme: 81%–100% = Fully met; 56%–80% = Largely met;
  11%–55% = Partially met; 0%–10% = Not met

Table 5.4 shows the competency name and level at the top. The left-hand column shows the indicators, which are taken directly from the source framework. The second column shows the evidence types that the assessors were to look for. As was discussed earlier, in this case study these evidence types were kept to a minimum and much of the interpretation of results was left to the professional discretion of the assessors.

The third column is left blank on the form itself, as this is where the evidence is recorded during the interview; it is shown here with the information already completed. The final column records whether the assessor believes that the assessee has provided enough evidence to obtain a pass for this indicator.

The ‘Rating’ section of the table is a straight percentage of indicators passed, which is used by the rating scheme to decide what level of competence is awarded. It was pointed out in the previous case study that the INCOSE framework is often inconsistent in the number of indicators defined for competencies, and here we can see an example of this. As there are only four indicators, the only possible results are 0 per cent, 25 per cent, 50 per cent, 75 per cent or 100 per cent. Due to this limited number of results, the difference between two competency grades (such as partially met and largely met) can be as little as a single indicator.

The example here shows that the assessee has achieved 50 per cent as their rating, which translates to an overall ‘partially met’ score for this competency at level 1 – clearly, this person has some room to improve.
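
The mechanics of the rating scheme are simple enough to sketch in a few lines of Python. The function below is illustrative only (its name is not part of UCAM), but the percentage bands are exactly those shown in Table 5.4.

    # A sketch of the Table 5.4 rating scheme: the proportion of
    # indicators passed is converted to a percentage, which the bands
    # then translate into a level rating.
    def level_rating(indicator_results):
        """indicator_results: a list of booleans, one per indicator."""
        percentage = 100.0 * sum(indicator_results) / len(indicator_results)
        if percentage > 80:
            return "Fully met"      # 81%-100%
        if percentage > 55:
            return "Largely met"    # 56%-80%
        if percentage > 10:
            return "Partially met"  # 11%-55%
        return "Not met"            # 0%-10%

    # The worked example above: two of four indicators passed, i.e. 50%.
    print(level_rating([True, False, False, True]))  # -> Partially met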

Once the results have been recorded for each competency, at each level as specified in the competency scope, then the output can be collated into the competency profile. The competency profile (output) looks suspiciously like the competency scope (input) which is to be expected, but this time, each of the cells may be completed with the actual score achieved to provide the overall profile.

Figure 5.9 summarises the final profile. The basic table from the competency scope was used as the starting point but, this time, the evidence types are not shown as they were with the scope. The original competency scope is indicated by the border between the faded ‘Not assessed’ text cells and the other cells – any cells out of scope are marked with ‘Not assessed’. Any cells within the scope have the score achieved written into them: ‘Fully met’, ‘Largely met’, ‘Partially met’ or ‘Not met’. In addition, any competencies that have scored ‘Fully met’ at a given level are shaded to increase the visual impact of the results.

Figure 5.9 Assessment output (profile) for a defined role
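
As a further sketch, collating the recorded results into a profile can be thought of as filling in the scope grid: cells inside the scope receive the level rating achieved, while cells outside the scope are simply absent and read as ‘Not assessed’. The function below is an illustrative assumption, not a defined UCAM artefact.

    # Collate per-level ratings into a competency profile.
    # scope:    competency -> required level
    # achieved: (competency, level) -> level rating, as recorded
    def build_profile(scope, achieved):
        profile = {}
        for competency, required_level in scope.items():
            for level in range(1, required_level + 1):
                profile[(competency, level)] = achieved.get(
                    (competency, level), "Not met")
        return profile  # any cell absent from 'profile' is 'Not assessed'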

Using the profiles

The profiles that were generated proved to be very useful in a number of ways – some expected and some not expected:

  • The profiles were used immediately to place a consultant in a client organisation by matching the profile of the assessment against the original scope for a role. This was a predicted use of the profile.

  • The profiles were used to generate the first step of a new training programme. This was achieved by looking at where the largest gaps were between the profile (output) and the scope (input). In the example here, the largest gap appears in the ‘Determine and manage stakeholder requirements’ competency. When all the profiles were considered together, this emerged as a common pattern across all assessees. On this basis, it was then possible to look into training options (a minimal sketch of this gap calculation is given after this list). This was a predicted use of the profile.

  • One of the assessees, during the course of the discussion, stated that they did not hold chartered engineering status, despite having over 20 years of professional experience. When probed further, the assessee said that they were intimidated by the forms and all the necessary information that would need to be provided. The assessee was, therefore, delighted to discover that the results of the assessment could be used directly as evidence for gaining chartered status. This was a non-predicted and positive use of the profile.

  • All assessees stated that they enjoyed the assessments and that, in all cases, they now knew more about both competency assessments and the INCOSE framework. This can be seen as actually raising the profile of INCOSE within industry: another unexpected and positive outcome.

  • The final outcome was that the company’s systems engineering manager now felt that he was in a position to make more use of the profiles, as the assessments had been a great ‘learning exercise’. This was seen as a valued contribution to future systems engineering best practice within the business and is intended to be used as an input to a higher-level approach to systems engineering, a part of which is concerned with systems engineering competency assessment.
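
The gap analysis mentioned in the second point above is straightforward to sketch. In the illustrative Python below, the gap for a competency is the difference between the level demanded by the scope and the highest level at which the profile shows ‘Fully met’; across many assessees, the largest common gaps point at training priorities. The function and names are assumptions for illustration only.

    # Gap between a competency scope (input) and a profile (output).
    def competency_gaps(scope, profile):
        gaps = {}
        for competency, required_level in scope.items():
            achieved = 0
            for level in range(1, required_level + 1):
                if profile.get((competency, level)) == "Fully met":
                    achieved = level
                else:
                    break  # levels must be fully met in order
            gaps[competency] = required_level - achieved
        return gaps

    # The largest gap suggests the first training priority, e.g.:
    # max(gaps, key=gaps.get)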

Overall, the assessment was deemed to be a great success by the assessees, the assessors (naturally) and the sponsor of the work. Another outcome of the assessment was associated with assessing the suitability of the source framework, the INCOSE competencies framework, for use in the organisation.

Observations on the INCOSE framework

With regard to using the INCOSE framework, all the assessments carried out for this case study were essential learning exercises that can be used to further the use of the framework. It was always intended that any feedback would be gathered and fed back to INCOSE, and a few high-level observations are made here:

  • Some of the competency descriptions are not very well balanced. For example, the ‘determine and manage stakeholder requirements’ competency has a total of 33 indicators across the four levels, whereas ‘design for …’ has only 10 indicators. Also, two of the levels have only a single indicator assigned to each, resulting in possible scores of only 0 per cent or 100 per cent for each level. This has been discussed thoroughly in the previous two case studies.

  • Many of the competencies cross-relate to one another and more work could be done in this area. For example, how does the competency for modelling relate to the competency for architecture design? Many of these competencies are very closely related and may well be dependent on one another. This is a feature that is present in the capability assessment world and one that could bear further investigation here.

  • Some of the competencies do not really match their descriptions, such as ‘life-cycle process definition’, which relates mainly to the understanding of life cycles rather than processes. This can cause problems, particularly when the assessors use the headings as a basis for asking questions, as the results can be misleading.

It must be stated categorically, however, that the basic conclusion was that the INCOSE systems engineering competencies framework is an excellent tool for the systems engineering community and, although there are a few problems with it, this is natural for an evolving entity such as this framework. Indeed, these observations were fed back to INCOSE and, at the time of writing, there is another issue of the framework in the process of being released. Finally, bear in mind that comments such as these could be derived from assessments using any other framework.
