WP2 1 Evaluation Strategy Design » History » Version 7
Version 6 (Steve Welburn, 2012-08-08 03:57 PM) → Version 7/10 (Steve Welburn, 2012-08-08 03:57 PM)
h1. WP2.1 Evaluation Strategy Design
In order to evaluate the (training) materials, it is necessary to:
* identify specific (learning) objectives which they aim to meet
* evaluate whether the materials meet those objectives
Additionally, we need to:
* identify the overall purposes of the materials
* evaluate whether the cumulative objectives satisfy these overall purposes
h2. Methods of Evaluation
Evaluation covers two strands: evaluation of the materials themselves, and on-going evaluation of the course as it is delivered.
* Informal / Formal Review e.g.:
** "Cloze Test":http://en.wikipedia.org/wiki/Cloze_test to assess readability
** Questionnaire to see how easy it is to find relevant material
** Internal use of materials
** "I-Tech Instructional Design & Materials Evaluation Form and Guidelines":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html
** Focus Groups ("I-Tech guide":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html)
* In-course / formative evaluation (see "I-Tech":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/dev_eval.html)
** Assessment of level of knowledge within training group
** Checking progress with participants
** Trainer assessment - self assessment and from other trainers if possible
** Pre- and Post-course questionnaires - assess change in answers (true/false + multiple-choice)
* Post-course / summative evaluation
** Debriefing of trainer (did it work? did it run to time? did it engage people?)
** Questionnaire for participants "Sample training evaluation forms":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/forms.html
** Medium-term review of usefulness of course content / adoption of techniques (e.g. 2-3 months after course)
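The Cloze test mentioned above works by blanking out every nth word of a passage and asking readers to fill the gaps; the higher the proportion they restore correctly, the more readable the text is for them. A minimal sketch (the sample passage, interval and blank marker are illustrative choices, not part of any particular toolkit):

```python
# Sketch of a Cloze test generator: blank out every nth word of a
# passage so readers can attempt to fill the gaps. Interval and blank
# marker are arbitrary illustrative defaults.

def make_cloze(text, n=5, blank="_____"):
    """Replace every nth word with a blank, keeping the other words intact."""
    words = text.split()
    for i in range(n - 1, len(words), n):
        words[i] = blank
    return " ".join(words)

sample = ("Good data management helps researchers find, reuse "
          "and share their data throughout a project.")
print(make_cloze(sample, n=5))
```

Scoring is then simply the fraction of blanks a reader fills with the original (or an acceptable) word.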
"Kirkpatrick's Four Level Evaluation Model":http://www.nwlink.com/~donclark/hrd/isd/kirkpatrick.html
* Reaction - to the course ("motivation" may be more appropriate)
** Pacing; was it enjoyable?
* Learning - from the course
** Did the facts get across?
* Behavior - changes after the course
** Did participants actually manage their data better
*** during research?
*** at the end of research?
** Have data management plans been produced for grant proposals?
* Results or Impact - to a wider community
** Did they publish data?
** Was any data loss avoided?
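The "Learning" level above can be quantified with the pre- and post-course questionnaires: score each participant's answers against an answer key and look at the change. A sketch, with a made-up answer key and responses (true/false questions only; any real questionnaire and scoring scheme would differ):

```python
# Sketch: compare pre- and post-course questionnaire answers to gauge
# learning. One True/False answer per question; the answer key and the
# responses below are invented for illustration.

def score(answers, key):
    """Count answers that match the answer key."""
    return sum(a == k for a, k in zip(answers, key))

answer_key = [True, False, True, True]
pre  = [False, False, True, False]   # 2 correct before the course
post = [True, False, True, True]     # 4 correct after the course

gain = score(post, answer_key) - score(pre, answer_key)
print(f"Score change: {gain:+d}")   # a positive change suggests learning
```

Aggregating the per-participant gains across the group gives a simple before/after measure for the course as a whole.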
Review content for:
* reading level
* correctness
* organization
* ease of use
each judged against the needs of the target audience.
h2. Tools and links
"Bristol Online Surveys":http://www.survey.bris.ac.uk/
"I-Tech Training Toolkit":http://www.go2itech.org/HTML/TT06/toolkit/evaluation/index.html
* Instructional System Design approach to training
* "Analysis, Design, Development, Implement, Evaluate (ADDIE)":http://www.nwlink.com/~donclark/hrd/sat6.html
Free Management Library - "Evaluating Training and Results (ROI of Training)":http://managementhelp.org/training/systematic/ROI-evaluating-training.htm
Lingualinks "Implement A Literacy Program - Evaluating Training":http://www.sil.org/lingualinks/literacy/ImplementALiteracyProgram/EvaluatingTraining.htm