On evaluation design

The fifth set of principles are possibly the most difficult to put in place. Up to now every previous principle put in place has led to a whole set of different data, from different sources, that just happen to be around, contributed by and perhaps analysed by, a lot of different people. At this stage, it could be seen to be a bit of a mess.

However, that’s where the skill of the evaluator comes into its own. It’s taking these disparate sets of data and looking for commonalities, differences, comparisons, and even single case studies that stand out and elucidate an area on their own. The strength of having such disparate sets of data is that they are:

#5.1 eclectic, multimodal, mixed methodologically

However, it’s still necessary to put a minimal (remember, light touch) but more robust evaluation in place at the core, in the form of a survey or questionnaire. This needs to contain a pre- and post-test and be open to quantitative analysis (some people only take numbers seriously). This runs counter to the principles of being aligned with practice and opportunistic, as it imposes a minimum level of participation, but as long as it’s not too onerous, it’s not too much to ask. Usually, though, this is the part that is the biggest struggle to get done.

So… #5.2 quantitative comparative analysis, requiring only a minimal imposed involvement from practitioners to complete, provides an essential safeguard that ensures the robustness of the research
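As an illustration of what that minimal quantitative core might look like in practice, the sketch below compares paired pre- and post-test scores (e.g. Likert-scale self-ratings from the same participants) using a paired t-statistic computed with only the Python standard library. The variable names and sample scores are hypothetical, purely for illustration; a real evaluation would use the actual survey responses and an appropriate statistics package.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Compare paired pre/post scores from the same participants.

    Returns the mean improvement and the paired t-statistic
    (mean difference divided by the standard error of the differences).
    """
    if len(pre) != len(post) or len(pre) < 2:
        raise ValueError("need equal-length paired samples, n >= 2")
    diffs = [b - a for a, b in zip(pre, post)]
    mean_diff = mean(diffs)
    # Standard error of the mean difference: s / sqrt(n)
    se = stdev(diffs) / math.sqrt(len(diffs))
    return mean_diff, mean_diff / se

# Hypothetical pre/post self-ratings on a 1-5 scale for five practitioners
pre_scores = [2, 3, 2, 4, 3]
post_scores = [4, 4, 3, 5, 4]

mean_diff, t = paired_t(pre_scores, post_scores)
print(f"mean improvement: {mean_diff}, t = {t}")
```

Even a small table of scores like this gives the evaluation the numerical anchor described above, while the bulk of the evidence remains the opportunistic qualitative data.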

However, this is not the only robust aspect. Even though the remainder of the data are opportunistic, they are so wide-ranging that they will inevitably provide qualitative data in sufficient quantity, and with sufficient triangulation, to constitute an effective evaluation in themselves. It’s just good to have some numbers in there too.

Making the best of these elements, post hoc, is the most difficult aspect of this style of evaluation, and requires a bit of time just sifting through everything and working out what it is you’ve actually got. Allow a week without actually getting anything concrete done. It’s OK; it’s just part of the process. It requires the evaluator to synthesise the findings from each set of data and therefore to be…

#5.3 flexible, creative, patient

As Douglas Adams once said (though he was quoting Gene Fowler), “Writing is easy. All you do is stare at a blank sheet of paper until drops of blood form on your forehead.”


About the author

Mark Childs

As Senior Lecturer in Technology Enhanced Learning, Mark’s role is to help deliver the Technology Enhanced Learning Framework across Oxford Brookes and to support OCSLD and its staff with their online presence.

Mark’s career in Higher Education has two complementary strands: as a researcher in TEL since 1997 and as an educational developer in TEL since 2003. He has worked at the University of Wolverhampton, the University of Warwick and Coventry University. Between 2011 and arriving at Brookes in 2015, Mark worked as a “freelance academic”, providing educational research, consultancy and training for a range of clients including the Open University, Hewlett Packard, The Field Museum of Natural History in Chicago, Ravensbourne College, Worcester University and the Tablet Academy, where he is currently research director.

Mark’s educational development work is informed by, and provides a goal for, his research. The core of these research interests is the use of a wide range of synchronous communication platforms for education, including social media, videoconferencing, virtual worlds and games-based learning. His most recent work is in the area of online collaboration for design using social media and videoconferencing, where he has evaluated the learner experience of students in distributed teams in projects led by Loughborough University and by CARNet in Zagreb, Croatia. In parallel to this he has a wide-ranging interest in many other fields of research; for example, his most recent publication is a DVD with the OU on Ethiopia’s progress towards reducing child and maternal mortality.
