WP2.1 Evaluation Strategy Design
In order to evaluate the (training) materials, it is necessary to:

- identify specific (learning) objectives which they aim to meet
- evaluate whether the materials meet those objectives

Additionally, we need to:

- identify the overall purposes of the materials
- evaluate whether the cumulative objectives satisfy these overall purposes
In order to produce the best possible materials, it is necessary to evaluate and then revise the materials. Initial evaluation of the materials will take place once a first draft has been created, but before they are used in training. This will concentrate on the suitability and level of the content. After the initial evaluation and update, the materials will be used in training courses and begin an ongoing series of formative and summative evaluations (i.e. evaluations during and after the training). These evaluations will apply Kirkpatrick's four-level evaluation model [1].
Methods of Evaluation
Design review

Pre-course (evaluation of materials)

- Cloze test to assess readability (a short sketch of generating and scoring a cloze passage follows this list)
- I-Tech Instructional Design & Materials Evaluation Form and Guidelines
- Focus groups (I-Tech guide), e.g. with some members of C4DM
- Internal use of materials, i.e. within C4DM
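As a concrete illustration of the cloze test, the sketch below blanks every nth word of a passage and scores exact-word replacements. The sample passage, function names and parameters are illustrative assumptions, not part of any project tooling.

```python
"""Minimal cloze-test sketch: blank out every nth word of a passage.

All names, parameters and the sample passage are illustrative assumptions.
"""
import re


def make_cloze(text, n=5, skip=5):
    """Blank every nth word after an intact lead-in of `skip` words.

    Returns the gapped passage and the list of deleted words (the answer key).
    """
    words = text.split()
    gapped, answers = [], []
    for i, word in enumerate(words):
        # Keep the first `skip` words intact for context, then delete
        # every nth word thereafter.
        if i >= skip and (i - skip) % n == 0:
            answers.append(word)
            gapped.append("_____")
        else:
            gapped.append(word)
    return " ".join(gapped), answers


def score_cloze(responses, answers):
    """Exact-word scoring: fraction of gaps filled with the original word."""
    def clean(w):
        return re.sub(r"\W+", "", w).lower()

    correct = sum(clean(r) == clean(a) for r, a in zip(responses, answers))
    return correct / len(answers) if answers else 0.0


if __name__ == "__main__":
    passage = ("A data management plan describes how research data will be "
               "collected, documented, stored and shared during and after "
               "the project.")
    gapped, key = make_cloze(passage)
    print(gapped)
    print("Answer key:", key)
    print("Score:", score_cloze(key, key))  # perfect responses score 1.0
```

Scores from a group of readers drawn from the target audience can then be compared against conventional cloze thresholds (exact cut-offs vary between sources) to judge whether the reading level of the materials is appropriate.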
On-going evaluation of course

- Informal / formal review, e.g.:
    - Questionnaire to see how easy it is to find relevant material / to test users' knowledge
    - Focus groups (I-Tech guide)
- In-course / formative evaluation (see I-Tech):
    - Assessment of the level of knowledge within the training group
    - Checking progress with participants
    - Trainer assessment: self-assessment and, if possible, assessment by other trainers
    - Pre- and post-course questionnaires to assess the change in answers (true/false and multiple-choice); see the sketch after this list
- Post-course / summative evaluation:
    - Debriefing of the trainer (did it work? did it run to time? did it engage people?)
    - Questionnaire for participants (sample training evaluation forms)
    - Medium-term review of the usefulness of the course content and adoption of techniques (e.g. 2-3 months after the course)
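To illustrate how the change in pre- and post-course answers might be summarised, the sketch below scores true/false and multiple-choice responses and reports a normalised gain per participant. The question identifiers, response format and the use of Hake-style normalised gain are assumptions for illustration, not a prescribed analysis.

```python
"""Sketch of summarising pre-/post-course questionnaire results.

Question identifiers, answer key and response data are illustrative assumptions.
"""
from statistics import mean


def question_score(responses, answer_key):
    """Fraction of questions answered correctly (true/false or multiple choice)."""
    correct = sum(responses.get(q) == a for q, a in answer_key.items())
    return correct / len(answer_key)


def normalised_gain(pre, post):
    """Hake-style normalised gain: improvement relative to the room left to improve."""
    return (post - pre) / (1.0 - pre) if pre < 1.0 else 0.0


if __name__ == "__main__":
    key = {"q1": "True", "q2": "b", "q3": "False", "q4": "c"}
    pre_responses = {"p1": {"q1": "False", "q2": "b", "q3": "False", "q4": "a"},
                     "p2": {"q1": "True", "q2": "a", "q3": "True", "q4": "c"}}
    post_responses = {"p1": {"q1": "True", "q2": "b", "q3": "False", "q4": "c"},
                      "p2": {"q1": "True", "q2": "b", "q3": "False", "q4": "c"}}

    gains = []
    for participant in pre_responses:
        pre = question_score(pre_responses[participant], key)
        post = question_score(post_responses[participant], key)
        gains.append(normalised_gain(pre, post))
        print(f"{participant}: pre={pre:.2f} post={post:.2f} gain={gains[-1]:.2f}")
    print(f"mean normalised gain: {mean(gains):.2f}")
```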
These evaluations will be mapped onto Kirkpatrick's four levels (collected into a simple checklist in the sketch below):

- Reaction to the course ("motivation" may be a more appropriate term)
    - Was the pacing right? Was it enjoyable?
- Learning from the course
    - Did the facts get across?
- Behavior changes after the course
    - Did participants actually manage their data better, both during research and at the end of research?
    - Have data management plans been produced for grant proposals?
- Results or impact on the wider community
    - Did they publish data?
    - Was any data loss avoided?
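The checklist below is one possible way of collecting the questions above under Kirkpatrick's four levels, e.g. as the basis of a trainer debriefing form. The structure and wording are illustrative assumptions rather than a defined project schema.

```python
"""Illustrative grouping of the evaluation questions above under
Kirkpatrick's four levels; field names and wording are assumptions."""

KIRKPATRICK_LEVELS = {
    "Reaction": [
        "Was the pacing right?",
        "Was the course enjoyable / motivating?",
    ],
    "Learning": [
        "Did the facts get across (change in pre-/post-course answers)?",
    ],
    "Behavior": [
        "Did participants manage their data better during and after research?",
        "Have data management plans been produced for grant proposals?",
    ],
    "Results": [
        "Did participants publish data?",
        "Was any data loss avoided?",
    ],
}


def evaluation_checklist():
    """Print a simple checklist grouped by level, e.g. for a debriefing form."""
    for level, questions in KIRKPATRICK_LEVELS.items():
        print(level)
        for question in questions:
            print(f"  [ ] {question}")


if __name__ == "__main__":
    evaluation_checklist()
```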
Throughout these reviews, the materials themselves will be assessed for reading level, correctness, organization, and ease of use, judged against the target audience.
Tools and links
Instructional System Design approach to training
Free Management Library - Evaluating Training and Results
Lingualinks Implement A Literacy Program - Evaluating Training
Training Works! What you need to know about managing, designing, delivering, and evaluating group-based training
References
[1] Kirkpatrick, D. L. (1959). Techniques for evaluating training programs. Journal of the American Society of Training Directors, 13, 3–9.
[2] Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development (2nd ed., pp. 301–319). New York: McGraw-Hill.