Evaluation questions for participants

Thu, 08/03/2017 - 10:28 -- deli65

I am wondering if anyone has a set of evaluation questions they use following a debrief, training, or process improvement event. I would like to incorporate the three RBA questions into the evaluation as well. The responses will be used to continually improve the trainings and the facilitation of meetings and events.

Thanks


Submitted by deli65 on


Any examples would be appreciated.


mary deling

Submitted by gkroberts on

Hi Mary,
Depending on the type of event, I use one of the following two evaluation methods:


First - Quick and Easy (responses are typically collected via sticky notes - see attached image)
1) What did you like about the training/meeting?
2) What would you change about the next training/meeting?

Second - Thorough Evaluation (SurveyMonkey or a one-pager)
Please indicate your agreement with the statements below:
1) The content of the meeting/event was relevant and useful
Response scale: (Extremely disagree, disagree, agree, extremely agree)
2) The duration of the meeting/event was sufficient
Response scale: (Extremely disagree, disagree, agree, extremely agree)
3) What did you like best about the meeting/event?
4) What can be improved for the next meeting/event?
5) I would recommend this training/event to others.
Response scale: (Extremely disagree, disagree, agree, extremely agree)
6) Additional Comments:


Gurleen Roberts, MPH
Director of Quality Management
Cobb & Douglas Public Health
Marietta, GA
gurleen.roberts@dph.ga.gov

Submitted by Grace Gorenflo on
The Kirkpatrick model for evaluating trainings is simple and straightforward. You can find peer-reviewed articles about its effectiveness, but I think this link is the most helpful: https://www.mindtools.com/pages/article/kirkpatrick.htm. Level 4, in particular, lends itself to RBA questions if you plan to do another evaluation some time after the training.

Submitted by tkane on

This is something that one of my colleagues, Sonja Armbruster from Wichita State University, picked up from the Midwestern Public Health Training Center. The following four questions are asked on a 5-point Likert scale (Strongly Disagree, Somewhat Disagree, Neither Agree nor Disagree, Somewhat Agree, Strongly Agree):

  1. My understanding of the subject matter has improved as a result of having participated in this training.
  2. I have identified actions I will take to apply information I learned from this training to my work.
  3. I will use at least one thing that I learned in this session in my work.
  4. I was satisfied with the training overall.

We also include the following open-ended questions: 

  1. What was most useful about this training/meeting?
  2. What was least useful about this training/meeting?

These six questions make for a short evaluation but typically yield helpful feedback, particularly the two open-ended questions.

Ty Kane


Submitted by johnshutze on

We've moved away from surveys and evaluations that use a scale or rating; we didn't find them helpful in modifying our training. Instead, we've taken the approach referenced in the attachment as a better way to hear the voice of the customer, with great success. It can be summed up in four questions:

  1. What did you like about the training?
  2. How might we improve the training (constructive criticism)?
  3. What questions did the training raise?
  4. What ideas did the training spark?

It came from the Design Thinking toolkit, which is available for download at:

https://dschool.stanford.edu/resources/the-bootcamp-bootleg


Submitted by deli65 on

Thanks, everyone, for the great responses and ideas.


mary deling