WP2.1 Evaluation Strategy Design » History » Version 10

Steve Welburn, 2012-08-09 03:02 PM

h1. WP2.1 Evaluation Strategy Design

In order to evaluate the (training) materials, it is necessary to:
* identify specific (learning) objectives which they aim to meet
* evaluate whether the materials meet those objectives

Additionally, we need to:
* identify the overall purposes of the materials
* evaluate whether the cumulative objectives satisfy these overall purposes

In order to produce the best possible materials, it is necessary to evaluate and then revise them. Initial evaluation will take place once a first draft has been created, but before the materials are used in training; it will concentrate on the suitability and level of the content. After this initial evaluation and revision, the materials will be used in training courses and will undergo an ongoing series of formative and summative evaluations (i.e. evaluations during and after the training). These evaluations will apply Kirkpatrick's four-level evaluation model[1].

h2. Methods of Evaluation

Design review

Pre-course (evaluation of materials)
* "Cloze Test":http://en.wikipedia.org/wiki/Cloze_test to assess readability
* I-Tech Instructional Design & Materials Evaluation "Form and Guidelines":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html
* Focus Groups ("I-Tech guide":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html) e.g. some of C4DM
* Internal use of materials i.e. C4DM
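
The Cloze test listed above can be automated for quick checks of a draft. The sketch below is a minimal illustration of the standard cloze procedure (delete every fifth word, exact-match scoring); the function names are ours for illustration, not part of any toolkit mentioned here.

```python
def make_cloze(text, n=5):
    """Blank out every nth word (standard cloze procedure) and
    return the gapped passage plus the list of deleted words."""
    words = text.split()
    gapped, answers = [], []
    for i, word in enumerate(words, start=1):
        if i % n == 0:
            answers.append(word)
            gapped.append("_____")
        else:
            gapped.append(word)
    return " ".join(gapped), answers

def cloze_score(responses, answers):
    """Exact-match scoring: fraction of blanks filled correctly,
    ignoring case and surrounding whitespace."""
    correct = sum(r.strip().lower() == a.strip().lower()
                  for r, a in zip(responses, answers))
    return correct / len(answers)
```

A score of roughly 60% or more correct is conventionally read as the passage being at an independent reading level for that participant.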

Ongoing evaluation of the course

* Informal / Formal Review e.g.:
** Questionnaire to see how easy it is to find relevant material / test users' knowledge
** Focus Groups ("I-Tech guide":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html)
* In-course / formative evaluation (see "I-Tech":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/dev_eval.html)
** Assessment of level of knowledge within training group
** Checking progress with participants
** Trainer assessment - self assessment and from other trainers if possible
** Pre- and Post-course questionnaires - assess change in answers (true/false + multiple-choice)
* Post-course / summative evaluation
** Debriefing of trainer (did it work? did it run to time? did it engage people?)
** Questionnaire for participants "Sample training evaluation forms":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html
** Medium-term review of usefulness of course content / adoption of techniques (e.g. 2-3 months after course)
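
The pre-/post-course questionnaire comparison above can be summarised per question. A minimal sketch, assuming each participant's responses are recorded as a dict of question id to correct/incorrect — a hypothetical data shape chosen for illustration, not a prescribed format:

```python
def knowledge_gain(pre, post):
    """Per-question change in the proportion of correct answers
    between pre- and post-course questionnaires.

    pre, post: one dict per participant, mapping
    question id -> True (correct) / False (incorrect)."""
    gain = {}
    for q in pre[0]:
        before = sum(p[q] for p in pre) / len(pre)
        after = sum(p[q] for p in post) / len(post)
        gain[q] = after - before
    return gain
```

A near-zero or negative gain on a question flags material that either was not covered effectively or was already known before the course.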

"Kirkpatrick's Four Level Evaluation Model":http://www.nwlink.com/~donclark/hrd/isd/kirkpatrick.html
* Reaction - to the course ("motivation" may be more appropriate)
** Pacing; was it enjoyable?
* Learning - from the course
** Did the facts get across?
* Behavior - changes after the course
** Did participants actually manage their data better:
*** during research?
*** at the end of research?
** Have data management plans been produced for grant proposals?
* Results or Impact - to a wider community
** Did they publish data?
** Was any data loss avoided?

Review content for:
* reading level
* correctness
* organization
* ease of use

based on the target audience.
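
The reading-level check above can be given a first automated pass before any human review. Below is a rough sketch of the standard Flesch Reading Ease formula; the vowel-group syllable count is our crude approximation, adequate only for comparing drafts of the same material, not for absolute grading.

```python
import re

def flesch_reading_ease(text):
    """Estimate Flesch Reading Ease (higher = easier to read).
    Syllables are approximated by counting vowel groups per word."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))
```

Scores around 60–70 correspond to plain English; very low or negative scores suggest the material needs simplification for a general audience.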

h2. Tools and links

"Bristol Online Surveys":http://www.survey.bris.ac.uk/

"I-Tech Training Toolkit":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/index.html

Instructional System Design approach to training
* "Analysis, Design, Development, Implement, Evaluate (ADDIE)":http://www.nwlink.com/~donclark/hrd/sat6.html

Free Management Library - "Evaluating Training and Results (ROI of Training)":http://managementhelp.org/training/systematic/ROI-evaluating-training.htm

Lingualinks "Implement A Literacy Program - Evaluating Training":http://www.sil.org/lingualinks/literacy/ImplementALiteracyProgram/EvaluatingTraining.htm

"Training Works!":http://www.reproline.jhu.edu/english/6read/6training/Tngworks/index.htm - what you need to know about managing, designing, delivering, and evaluating group-based training

h2. References

[1] Kirkpatrick, D. L. (1959). Techniques for evaluating training programs. Journal of the American Society of Training Directors, 13, 3–9.

[2] Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development (2nd ed., pp. 301–319). New York: McGraw-Hill.