Evaluation – how to avoid the “not knowing what you don’t know” trap

Dr Jarlath O’Donohoe instructs on life support courses in many different countries (he has lots of embroidered shirts to prove it).  He is the educator on our Generic Instructor Courses and has been doing a lot of work on our course evaluation forms.  Here are some recent thoughts from him on “pre-post evaluation”:
“No one likes to waste their time. In developing-country health services this is even more important, since human resources are so scarce. Equally important is to avoid thinking you are wasting your time when you are not. In evaluating a training exercise it seems obvious that asking questions before and then after the training will help identify what is worth doing and what is not.
However, as in so many other spheres of life, what is obvious is not always true and what is true is not always obvious. It turns out (Bhanji F, Gottesman R, de Grave W. The Retrospective Pre–Post: A Practical Method to Evaluate Learning from an Educational Program. Academic Emergency Medicine, February 2012) that asking questions at both the beginning and the end may fail to identify useful learning. The example given is of someone who thinks himself quite knowledgeable at the beginning of a course and scores himself 7/10. Then, having learned a lot more about the topic, he again scores himself 7/10 at the end. There has been a lot of learning, but his sense of the scale has shifted: the 10 at the end of the course is a much bigger 10, and therefore the 7 is a much bigger 7. Comparing the two identical scores cannot show this learning statistically.
The term the authors use is retrospective pre-post (RPP) evaluation. Experience has shown that some people score themselves as highly capable at the beginning of our training sessions yet are unable to perform skills such as bag-and-mask ventilation. So we have moved to an RPP approach to evaluation.”
This entails giving our learners just one questionnaire, at the end of the course, which asks how confident they felt in certain skills and areas of knowledge before the course and how confident they feel now that it is over. It seems that recall bias might be a lesser evil than “not knowing what you don’t know”.
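To make the arithmetic behind this concrete, here is a minimal sketch in Python using entirely made-up confidence ratings (loosely based on the 7/10 learner in the example above). It is only an illustration of the two ways of measuring change, not an analysis from the paper or from our own evaluation forms.

```python
# Hypothetical self-ratings on a 0-10 confidence scale (illustrative only).

# Traditional pre-post design: the learner rates themself before and after.
pre_rating = 7    # "I think I'm quite good at this" (before the course)
post_rating = 7   # same number, but judged against a recalibrated sense of "10"

# Retrospective pre-post (RPP) design: at the end of the course the learner
# rates how confident they feel NOW and, looking back, how confident they
# realise they actually WERE before the course started.
retrospective_pre_rating = 4
now_rating = 7

traditional_change = post_rating - pre_rating           # 0 -> looks like no learning
rpp_change = now_rating - retrospective_pre_rating      # 3 -> learning becomes visible

print(f"Traditional pre-post change:   {traditional_change}")
print(f"Retrospective pre-post change: {rpp_change}")
```

The point of the sketch is simply that the traditional comparison subtracts two numbers taken against different mental scales, while the RPP comparison asks for both numbers against the same, end-of-course scale.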