THE FRAMEWORK POPULATION PROCESS

This section describes the ‘Framework Population’ process introduced in ‘The UCAM processes’ section above. The process is executed to establish the competencies from source frameworks that will be used as a basis of assessment, and the evidence that will be accepted to support an assessment.

Requirements for the Framework Population process

The requirements for the ‘Framework Population’ process are given below in Figure 4.8 which highlights the relevant requirements on the diagram introduced in Figure 4.1 above.

Figure 4.8 Requirements for the Framework Population process

As can be seen in Figure 4.8, the main requirement for the ‘Framework Population’ process is to ‘Populate framework’. The activities that are needed in the process to meet this requirement, and the artefacts that form inputs to and outputs from the process are described in the following section.

Contents of the Framework Population process

The contents of the ‘Framework Population’ process are shown in Figure 4.9, annotated to show whether the various artefacts are inputs to or outputs from the process.

Figure 4.9 shows that there are six activities that have to be carried out in the ‘Framework Population’ process, namely ‘define rating scheme’, ‘identify applicable competency set’, ‘identify evidence types’, ‘map evidence types to levels per competency’, ‘generate record sheets’ and ‘review’. The process takes the ‘Source Framework Model(s)’ output from the ‘Framework Definition’ process and uses them to generate the other artefacts, such as ‘Applicable Competency Set’ and ‘Evidence Type’, that are shown in Figure 4.9. The way that the ‘Framework Population’ process is carried out is shown in the following section.

Figure 4.9 The contents of the Framework Population process

Carrying out the Framework Population process

Figure 4.10 shows how the ‘Framework Population’ process is carried out. The ‘soft boxes’ (rectangles with rounded corners) represent the various activities that have to be carried out and correspond to those shown in the bottom compartment in Figure 4.9. The vertical divisions (swim lanes) indicate which stakeholder role is responsible for carrying out which activity, and correspond to one or more of the stakeholder roles identified in Figure 4.9. The small rectangles containing arrows (known as ‘pins’) show inputs to and outputs from the various activities, with the name of the artefact flowing into or out of the activity shown on the line connecting the pins. These artefacts correspond to those found in the middle compartment in Figure 4.9. The two thick horizontal lines at top left indicate that the activities between the lines (‘identify evidence types’, ‘define rating scheme’, and so on) can conceptually take place in parallel. This region of the diagram, with its many pins and connecting lines, looks somewhat complex, but it simply shows that the ‘Source Framework Model’ artefact is an input to all three activities in the parallel region.

Figure 4.10 shows that the ‘Framework Population’ process begins with the ‘Assessment Tailor’ taking the ‘Source Framework Model’ artefacts as inputs to the process and undertaking the ‘identify evidence types’, ‘identify applicable competency set’ and ‘define rating scheme’ activities in parallel. These activities are intended to achieve the following:

  1. ‘identify evidence types’. This activity uses the ‘Source Framework Model(s)’ to identify the ‘Evidence Type(s)’ that are deemed to be acceptable to support a competency assessment. Associated with each ‘Evidence Type’ is a ‘Timeliness’ that specifies the acceptable time limits for the validity of a piece of evidence.

  2. ‘identify applicable competency set’. This activity is executed to identify those competencies from within the ‘Source Framework Model(s)’ that are deemed to be applicable to the work undertaken by the organisation. If a competency framework contains competencies that are not relevant to an organisation, then there is little point in assessing against them. The ‘Applicable Competency Set’ that is output by this activity contains just those competencies from the ‘Source Framework Model(s)’ that are relevant to the organisation.

  3. ‘define rating scheme’. In order to generate the output from a competency assessment (a ‘Competency Profile’, discussed in the section on the ‘Assessment’ process below), it is necessary to be able to convert the simple pass or fail result that is recorded against each competency ‘Indicator’ into a rating that can be used to generate the ‘Competency Profile’. This activity is executed to generate such a ‘Rating Scheme’.

Figure 4.10 Carrying out the Framework Population process

Once these three activities have been completed, a check is made that all are complete and that the various artefacts are fit for purpose. If there are any problems, then the process returns to the beginning and the three activities are repeated. If all is OK, then the ‘map evidence types to levels per competency’ activity is executed. This activity takes the ‘Applicable Competency Set’ and ‘Evidence Type(s)’ and assigns an ‘Evidence Type’ to each competency-level combination for all the competencies in the ‘Applicable Competency Set’. The resulting artefact is known as the ‘Populated Applicable Competency Set’, since it has been populated with the relevant ‘Evidence Type(s)’ and can now be used to help set up assessments, as described in the section below on the ‘Assessment Set-up’ process. This activity is repeated until all competencies and levels have been addressed.

When all have been considered, the ‘generate record sheets’ activity is executed. This simple activity generates the ‘Record Sheet(s)’ that are used during an assessment to record the results of the assessment. Finally, the ‘review’ activity is carried out by the ‘Reviewer’ in order to check that all the artefacts generated by the process are fit for purpose. If everything is OK, then the process ends; otherwise, the process restarts. The various artefacts of the ‘Framework Population’ process and their relationships are discussed further in the following section.
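The mapping described above can be sketched in code. The following is a minimal, hypothetical illustration (the class and function names are our own, not part of UCAM): each competency in the ‘Applicable Competency Set’ is paired with every level up to its maximum, and an ‘Evidence Type’ is assigned to each pairing to produce the ‘Populated Applicable Competency Set’.

```python
from dataclasses import dataclass

@dataclass
class CompetencyReference:
    """A reference to a competency in its source framework (illustrative)."""
    framework: str      # source framework the competency comes from
    competency: str
    max_level: int      # highest level the organisation will assess

def populate(applicable_set, evidence_for_level):
    """Assign an 'Evidence Type' to each competency-level combination.

    Returns a mapping {(competency, level): evidence type}, a simple
    stand-in for the 'Populated Applicable Competency Set'.
    """
    populated = {}
    for ref in applicable_set:
        for level in range(1, ref.max_level + 1):
            populated[(ref.competency, level)] = evidence_for_level(ref, level)
    return populated

# Here the same evidence type is used per level, as in Figure 4.12,
# though UCAM allows this to vary per competency.
refs = [CompetencyReference("INCOSE", "Systems thinking", 3)]
per_level = {1: "Informal course", 2: "Formal course", 3: "Lead activity"}
result = populate(refs, lambda ref, lvl: per_level[lvl])
print(result[("Systems thinking", 2)])  # → Formal course
```

Passing a function rather than a fixed table reflects the point made later in this section: the acceptable evidence may, but need not, be the same for all competencies at a given level.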

Artefacts of the Framework Population process

The main output from the ‘Framework Population’ process is the ‘Populated Applicable Competency Set’ that is used as the basis for the competency scopes generated by the ‘Assessment Set-up’ process, as described in the discussion of that process below. The artefacts of the ‘Framework Population’ process and the relationships between them are shown in Figure 4.11.

Figure 4.11 Relationships between the artefacts of the Framework Population process

At the heart of the artefacts, as can be seen in Figure 4.11, is the ‘Applicable Competency Set’. This is the subset of competencies deemed relevant to the organisation, abstracted from the ‘Source Framework Model(s)’ that form an input to the ‘Framework Population’ process. The ‘Applicable Competency Set’ contains these competencies via one or more ‘Competency References’, each of which is simply a reference to a competency in its source framework. As well as identifying the relevant competencies, the ‘Applicable Competency Set’ also identifies the maximum ‘Level’ to which each competency will be assessed in any assessments conducted by the organisation.

In order to capture the results of an assessment, a ‘Record Sheet’ is needed. This applies to those competencies and levels identified in the ‘Applicable Competency Set’. Assessed competencies are rated according to the ‘Rating Scheme’, an example of which is given in Table 4.2 below.

Figure 4.12 Example ‘Populated Applicable Competency Set’

When an ‘Applicable Competency Set’ is combined with the ‘Evidence Type(s)’, the result is the ‘Populated Applicable Competency Set’ that is the main output from the process. An example of a ‘Populated Applicable Competency Set’ is given in Figure 4.12.

The ‘Populated Applicable Competency Set’ in Figure 4.12 shows all those competencies that the organisation deems to be relevant to the work it undertakes, together with the maximum level to which each competency can be assessed, along with the ‘Evidence Type’ that will be accepted for each competency-level combination. There are a number of points to be noted about the information shown in Figure 4.12:

  • All competencies are shown as theoretically capable of being assessed to ‘Expert’ level, but there is no reason why the maximum level should be the same for all competencies.

  • All the competencies shown happen to be from the same source framework, but the whole point of the UCAM approach is that this need not be the case and, in reality, the ‘Populated Applicable Competency Set’ will contain competencies from a number of different frameworks.

  • The same ‘Evidence Type’ is shown for all competencies at a given level. Again, there is no reason why this needs to be the case.

It is important that the various ‘Evidence Type(s)’ are clearly understood and that the ‘Timeliness’ that defines the validity of each type is recorded. A simple example of this is given in Table 4.1.

Table 4.1 Example of ‘Evidence Types’ and associated ‘Timeliness’

Evidence type              | Description                                                                                                                       | Timeliness
Informal course            | A training course that is not recognised by a professional body.                                                                  | 2 years
Tacit knowledge            | Knowledge that the assessee can demonstrate through conversation.                                                                 | 5 years
Formal course              | A training course that is recognised by a professional body.                                                                      | 3 years
Activity                   | An activity relevant to the competency (or indicator) that the assessee has undertaken under supervision.                         | 2 years
Educational qualification  | A qualification recognised by an educational body such as a university.                                                           | 30 years
Lead activity              | An activity relevant to the competency (or indicator) that the assessee has undertaken and for which they have taken a lead role. | 3 years
Professional qualification | A qualification granted by a professional body, such as a CEng or CITP.                                                           | 5 years
Publication                | A publication, such as a paper or book, for which the assessee is a main contributor.                                             | 5 years (papers and so on); 10 years (books)
Activity definition        | Activities that the assessee has defined, such as organisational process or policy.                                               | 5 years

Table 4.1 lists a number of different values for ‘Evidence Type’, along with a brief description of each and its validity period, or ‘Timeliness’. The table shows a simple maximum age for each type, but much more complex schemes can be defined, such as having ‘Timeliness’ depend on the actual competency and level. Of course, this makes the conduct of assessments more difficult and time-consuming.
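A ‘Timeliness’ check of the simple kind shown in Table 4.1 amounts to comparing the age of a piece of evidence against the maximum age for its type. The sketch below illustrates this; the function name and the dictionary encoding are our own assumptions, and the publication entry is split into two keys to capture the differing limits for papers and books.

```python
from datetime import date

# Maximum validity in years for each 'Evidence Type', per Table 4.1.
TIMELINESS_YEARS = {
    "Informal course": 2,
    "Tacit knowledge": 5,
    "Formal course": 3,
    "Activity": 2,
    "Educational qualification": 30,
    "Lead activity": 3,
    "Professional qualification": 5,
    "Publication (paper)": 5,
    "Publication (book)": 10,
    "Activity definition": 5,
}

def evidence_is_valid(evidence_type: str, obtained: date, on: date) -> bool:
    """True if the evidence is still within its 'Timeliness' window.

    Uses a simple calendar-date comparison (leap-day edge cases ignored
    for the purposes of this sketch).
    """
    limit = TIMELINESS_YEARS[evidence_type]
    cutoff = date(on.year - limit, on.month, on.day)
    return obtained >= cutoff

# A formal course taken exactly two years before an assessment is still
# within its three-year limit.
print(evidence_is_valid("Formal course", date(2021, 6, 1), date(2023, 6, 1)))  # → True
```

A more complex scheme, such as ‘Timeliness’ varying per competency and level, would simply need a richer key than the evidence type alone.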

A ‘Rating Scheme’ is used during the ‘Assessment’ process (see below) to convert a percentage calculated from the number of indicators of a competency that are marked as having been passed into a ‘Level Rating’ that gives a textual description of the degree to which a given competency has been met at a given level. An example ‘Rating Scheme’ is shown in Table 4.2.

Table 4.2 shows the mapping from percentage to ‘Level Rating’. Thus, for a competency with five indicators at a particular level, three of which are considered to have been passed, the ‘Rating Scheme’ gives a ‘Level Rating’ of ‘Largely met’, since the percentage of indicators that have been passed for that competency and level is 60 per cent. The percentages and level ratings shown in Table 4.2 are those that have been found to be most useful by the authors and may not meet the requirements for every organisation.

Table 4.2 Example ‘Rating Scheme’

Level rating  | Percentage of indicators required
Fully met     | 86–100
Largely met   | 56–85
Partially met | 16–55
Not met       | 0–15
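The conversion described above is a straightforward band lookup, and can be sketched as follows. The band boundaries are taken from Table 4.2; the function name is illustrative. The bands are checked from the highest lower bound downwards, so fractional percentages (for example six of seven indicators, 85.7 per cent) still fall into a band.

```python
# Lower bound of each 'Level Rating' band, per Table 4.2,
# ordered from highest to lowest.
RATING_SCHEME = [
    (86, "Fully met"),
    (56, "Largely met"),
    (16, "Partially met"),
    (0, "Not met"),
]

def level_rating(indicators_passed: int, indicators_total: int) -> str:
    """Convert a pass count for one competency-level into a 'Level Rating'."""
    percentage = 100 * indicators_passed / indicators_total
    for lower_bound, rating in RATING_SCHEME:
        if percentage >= lower_bound:
            return rating
    raise ValueError("percentage cannot be negative")

# The worked example from the text: three of five indicators passed
# gives 60 per cent, which falls in the 56-85 band.
print(level_rating(3, 5))  # → Largely met
```

A different granularity of scheme, such as the three- or six-level schemes mentioned later in this section, would change only the contents of the band table.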

Summary of the Framework Population process

The ‘Framework Population’ process is concerned with defining the ‘Populated Applicable Competency Set’, ‘Evidence Type’ and ‘Rating Scheme’ as appropriate for an organisation or organisational unit such as a business unit or even a project team. These artefacts are the main inputs to the ‘Assessment Set-up’ process and define those competencies that are relevant to the organisation, the evidence that will be accepted and the rating scheme that will be used to convert the results of an assessment into a ‘Competency Profile’, as described below in the section on the ‘Assessment’ process.

Discussion on the Framework Population process

The contents of the populated applicable competency set not only depend on the source frameworks that have been chosen as a basis for competency assessments, but also on the activities carried out by an organisation or organisational unit. There is no point including a competency from a framework in the competency set if it relates to an area of work in which the organisation is not involved.

When it comes to defining evidence types, it is necessary to strike a balance between rigour and flexibility. Evidence has to be strong enough to demonstrate that a competency has been met, but too much rigour can make it impossible to run assessments in anything like a reasonable time. It is also important to think about any additional constraints, such as timeliness, that might need to be associated with evidence types.

In order to convert the simple ‘pass or fail’ marks recorded against each indicator into a graded mark that can be used to generate a competency profile it is necessary to define a rating scheme. When deciding on a rating scheme, it is important to establish the desired granularity of level ratings. Do you want a four-level scheme such as is seen in this book (‘not met’, ‘partially met’, ‘largely met’ and ‘fully met’), a three-level scheme, or a six-level scheme? It is essential that the number of indicators for a competency allows achievement of all level ratings. If this is not possible, then the introduction of additional indicators not found in the original framework may be necessary.

Finally, whatever the contents of the applicable competency set, the types of evidence deemed acceptable and the rating scheme defined, it is essential that the population of the framework be validated with those for whom the assessments are being performed.
