[Process picture: the three-step evaluation approach]

This process takes a three-step approach.

STEP 1 – Design your evaluation around a research question and select appropriate ways to capture student voice – the TOOL KIT can give you inspiration

STEP 2 – Carry out data collection – this may involve several touchpoints

STEP 3 – Analyse your evaluative data – being consistent in your approach – and share best practice



1 DESIGN

This stage involves developing a research question. Are you looking at student learning, or at student experience? What is it you want to try out, and why? Have you changed something and want to see whether it worked? The design takes into account data you may already be getting (e.g. via tutorials or attainment data) alongside the information you need to gather to answer your question – to find out what works and why.

A questionnaire that may be helpful as you plan your design can be found here: EVALUATION DESIGN

Some key principles:

*You can’t evaluate something effectively without focusing on a specific question

*You need to capture the starting point so you can compare progress at the end

*You want to identify what it is about your plan/practice that is working

*Tailor the design to anything from one-off sessions to larger-scale projects and units

*Embed it into teaching sessions for better responses

*Consider what success will look like – what indicators will you look for?

You might design an evaluation to look like this:

Design stages:

1 Decide on an outline timetable (depending on the type of unit you are exploring)

2 Consider your starting point

3 Plan times for data collection touchpoints

4 Select what those touchpoints are (see below) – generally keep these short: 5-10 minutes embedded in your sessions

5 Add in a reflection moment so you capture your own thinking en route (see below)

6 Decide what your final data set will be (is this your unit evaluation, or a different sort of summary?)

7 Ensure data collection points are integrated, not too complex, and not too time-consuming.

8 Plan how you will analyse these different data sets (see below)

9 Consider how you will use and share what you find out


2 DATA COLLECTION

Plan out the ways you want to collect information at your touchpoints. From your design, think about the sort of data you need to get (e.g. is it about how students are enjoying things, or about their confidence working towards an assessment?).

Some key principles:

*Embed these in sessions as far as possible so they are part of the teaching

*Make them meaningful to students – e.g. they could be reflection moments for students as they assess their progress, or ways for them to be creative

*Engage students in conversations about why this helps – they are part of the evaluation – not having it ‘done’ to them

*Your own reflection is important – capture your thinking in the moment and consider pivots if necessary

*Use different ways to capture student voices – to keep it varied and inclusive

*Customise approaches to your projects and try things out – keep this creative

Carry out your various stages of data collection and keep all your data carefully, ready to collate in the final stage.


3 ANALYSIS AND SHARING

This is an important stage to ensure your various types of data are analysed consistently, so your evaluation is robust.

Key principles are:

*analysis ensures your evaluation is robust by adding structure and consistency to your data

*student voices can be gathered in all sorts of formats – the analysis stage brings them all together

*spending time analysing the information means you can understand what works and why

*it is an iterative approach as you develop your pedagogic practice

*it allows you to evidence your work and share your best practice easily

Coding is a good way to think about this – marking up or colour-coding your various data sets (whether advice cards, Miro boards, surveys or dashboards) with a set of themes. These themes will be the things you identified in your evaluation design; additional themes will emerge through the data collection.

You can create Padlets or documents that collate themes from the different sorts of data; from there you can spot patterns and make links to draw together a narrative for each theme. This will build on what you may have noticed going on, but make it more tangible and linked to evidence. If you need help with this aspect, please contact f.hall@lcc.arts.ac.uk – I can show you some examples of how to do this.
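If you would like a lightweight digital tally alongside this, the sketch below shows one possible way to count coded themes. It is only an optional illustration, assuming a hypothetical spreadsheet exported as coded_responses.csv with one row per student comment and columns named source, theme and comment – a highlighted document or Padlet works just as well.

```python
import csv
from collections import Counter, defaultdict

# Hypothetical input file: coded_responses.csv, one row per student comment, with columns
#   source  - where the comment came from (advice card, Miro board, survey, dashboard...)
#   theme   - the theme you assigned when coding
#   comment - the student's words
theme_counts = Counter()
themes_by_source = defaultdict(Counter)

with open("coded_responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        theme = row["theme"].strip().lower()
        source = row["source"].strip()
        theme_counts[theme] += 1
        themes_by_source[source][theme] += 1

# Overall picture: which themes come up most often across all the data sets
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} comments")

# Per-source breakdown, useful for spotting patterns across the different sorts of data
for source, counts in themes_by_source.items():
    top_three = ", ".join(f"{t} ({n})" for t, n in counts.most_common(3))
    print(f"{source}: {top_three}")
```

The counts are only a starting point – the narrative for each theme still comes from reading the comments themselves.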

Once you have summarised your findings, you should be able to see what worked and what didn’t work about your test, experiment or unit.

You can then plan what you might do next time – the whole process is iterative and you can keep doing what works and change what isn’t working.

This questionnaire can help you with the final summary of your evaluation.

To share the findings you can do a variety of things – present at course or programme meetings, or at teaching and learning sessions in college or across UAL. This evaluation website itself is a place to capture your key findings too – talk to me about that (f.hall@lcc.arts.ac.uk); we’d love to include your evaluation as we build a website collecting good practice.

This process helps you evaluate your practice in a formal and iterative way – one that informs your teaching going forward, provides good evidence to support your thinking, and helps you understand and illustrate what is effective.

DO CONTACT ME WITH ANY QUESTIONS OR COMMENTS – f.hall@lcc.arts.ac.uk
