BRINGING IT ALL TOGETHER – THE UCAM META-MODEL

All of the four main elements that have been discussed may now be brought together in the form of the UCAM meta-model, which is shown in Figure 3.8.

Figure 3.8 shows how the four main elements may be brought together and, very importantly, it shows the relationships between these elements. By relating the various elements together, the meta-model starts to enforce consistency between the different elements and makes the whole approach cohere.

The key relationships are as follows:

  • The ‘Applicable Competency Set’ identifies one or more ‘Competency(ies)’. This requirement forms the cornerstone of the whole assessment as it means that each competency in the applicable competency set comes from a source framework somewhere.

  • Each ‘Indicator’ is demonstrated by one or more ‘Evidence Type(s)’. This relationship ties together the definition of the evidence types with the indicators that are measured as part of the assessment.

  • The ‘Competency Scope’ applies to the ‘Applicable Competency Set’. It is essential that the competency scope, one of the main inputs to the actual assessments, is taken from the applicable competency set.

  • The ‘Rating Scheme’ classifies the relationships between ‘Competency Profile’ and ‘Indicator Result Set’. Notice that this defines the nature of the relationship between the two concepts.

  • The ‘Indicator Result Set’ is directly related to the ‘Indicator Set’, as it records the results associated with each indicator.

  • The ‘Indicator Result’ is directly related to the ‘Indicator’, as it records the result for each individual indicator.

Figure 3.7 Example competency profile

Figure 3.8 The UCAM meta-model

The combination of all these elements, along with their relationships, forms the UCAM meta-model that will be used as a basis for the UCAM assessment process, which will be fully described in the next chapter.
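The relationships listed above lend themselves to a simple data-model sketch. The following is a hypothetical illustration only: the class names mirror the meta-model elements in Figure 3.8, but the attribute shapes and the `is_consistent` check are assumptions, not part of the published UCAM model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Competency:
    name: str  # each competency comes from a source framework

@dataclass
class Indicator:
    description: str
    evidence_types: List[str]  # demonstrated by one or more evidence types

@dataclass
class ApplicableCompetencySet:
    competencies: List[Competency]  # identifies one or more competencies

@dataclass
class CompetencyScope:
    applicable_set: ApplicableCompetencySet
    selected: List[Competency]

    def is_consistent(self) -> bool:
        # the competency scope must be taken from the applicable competency set
        return all(c in self.applicable_set.competencies for c in self.selected)
```

For example, a scope that selects a competency not present in the applicable competency set would fail the `is_consistent` check, which is exactly the kind of consistency the meta-model is intended to enforce.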

Process automation and tools

As with any process, once it is fully understood and defined, it becomes possible to automate it in an attempt to simplify it and make it more efficient. In recent years, a great many software applications have been designed to make our lives easier when carrying out processes and assessments. These may be thought of as falling into the following categories.

  • Process automation tools. There are many tools on the market that allow processes to be automated to a certain extent. Consider a process that is defined in terms of its activities (what is done), its artefacts (what is produced or consumed) and its responsibilities (the stakeholders). A typical process automation tool will guide the user through the various activities, indicate which stakeholder should be involved and provide templates for the artefacts. This is all well and good, but it is essential not to confuse automating a process, in the sense of guiding someone through it, with actually performing that process. A process automation tool will allow any process to be automated, but will not capture results. If the tool is also to capture the results of an assessment, then assessment tools need to be considered.

  • Assessment tools. These are tools that actually execute a specific assessment process. There is a plethora of such tools related to process assessment, but far fewer related to competency assessment. A process assessment tool will do more than guide a user through the process: it will capture the data, analyse the data (in some cases) and present the results in charts, tables and so on.

When considering tools, it is important that the right one is chosen to suit your individual requirements. Remember, the key difference is that process automation tools will allow any process to be automated, but will not capture results or perform any analysis. Assessment tools will usually execute a specific process, but will capture results, analyse them, visualise them, and so on.
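The distinction between the two categories can be sketched in code. This is a minimal, hypothetical illustration, not a model of any real product: a process automation tool only steps a user through activities, whereas an assessment tool also captures and summarises results per indicator.

```python
class ProcessGuide:
    """Process automation: steps a user through activities, captures nothing."""
    def __init__(self, activities):
        self.activities = list(activities)

    def next_steps(self):
        return self.activities


class AssessmentTool(ProcessGuide):
    """Assessment tool: executes a specific process AND records indicator results."""
    def __init__(self, activities):
        super().__init__(activities)
        self.results = {}

    def record(self, indicator, met):
        # capture a result against an individual indicator
        self.results[indicator] = met

    def summary(self):
        met = sum(1 for v in self.results.values() if v)
        return f"{met}/{len(self.results)} indicators met"
```

The inheritance is deliberate: an assessment tool does everything the process guide does, plus the result capture and analysis that the guide lacks.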

Competent assessors

One of the key considerations for pragmatic assessment comes down to the competence of the assessors themselves. There are some core areas where the assessors will require specific competencies:

  • In the process. In the case of this book, the assessor would need to be completely familiar with the relevant UCAM processes, how they work, how to capture results, and so on. The word relevant should be stressed here as, depending on which assessor role is taken, they may need to be familiar with a different set of processes.

  • In the relevant competency framework. The assessors must have an understanding of the source framework that is being assessed against. This can be covered in part by the ‘Framework Definition’ process in UCAM, which is geared towards generating an understanding of any relevant source framework.

  • In the appropriate domain. Whoever is carrying out the assessment must have appropriate domain knowledge. It is not enough to read and understand the source frameworks as the information in them is open to misinterpretation and, therefore, an expert decision may need to be made to clear up any ambiguities.

The obvious answer to how this can be achieved is to define a competency scope for the assessor roles and ensure that anyone involved in assessment holds the appropriate levels of competence. There are three main categories of skills required, as discussed in the previous list, and the following levels are recommended.

The table below shows the levels required for each of the three skill areas discussed previously. These levels are based on a four-level system, such as has been used in several examples throughout this book.

Table 3.4 Suggested scope levels

                            Primary assessor    Secondary assessor
  UCAM process              Level 3             Level 2
  Framework understanding   Level 2             Level 2
  Domain knowledge          Level 3             Level 2
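The suggested levels in Table 3.4 can be applied mechanically when vetting candidate assessors. The following sketch is an assumption of how such a check might look: the role and skill names are taken from the table, but the `meets_scope` helper itself is hypothetical.

```python
# Suggested scope levels from Table 3.4 (four-level system).
SUGGESTED_SCOPE = {
    "Primary assessor":   {"UCAM process": 3, "Framework understanding": 2, "Domain knowledge": 3},
    "Secondary assessor": {"UCAM process": 2, "Framework understanding": 2, "Domain knowledge": 2},
}

def meets_scope(role, held):
    """True if every held level meets or exceeds the suggested level for the role."""
    return all(held.get(skill, 0) >= level
               for skill, level in SUGGESTED_SCOPE[role].items())
```

For instance, a candidate holding Level 2 in all three areas would qualify as a secondary assessor but not as a primary assessor, whose role demands Level 3 in both the UCAM process and domain knowledge.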
