
Making use of your tracker data: a gardener’s guide

As the last of the snow melts (we hope!) and the daffodils straighten up for spring, you probably also have a fresh crop of tracker data blooming in your BOS dashboard.


Think of this blog post as your gardener’s guide. (Note that benchmarking data will not be available until after the UK trackers are closed on 30 April. So even if you have data to play with now, it’s worth going back to the garden in early May and downloading the benchmark data. This is the contrasting foliage that will show your flowers off to their best effect.)

Arrangement 1: as nature intended
BOS produces readable charts for all the closed questions. The default .pdf file is not the most beautiful presentation, but it is good enough for many uses. You could use the Question by Question Guide to identify which stakeholders are likely to be interested in which question(s), and make sure they see the relevant results. Showing the right data to the right people is the most important thing to do.

For a more advanced arrangement, download all the closed data as a .csv file and open it in Excel. This gives you a wider range of charting options. A special commendation if you choose a colour swatch that matches your organisation’s branding.
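If you are happier with a script than a spreadsheet, something like the sketch below does the same job. It is a rough Python example using pandas and matplotlib; the file name, column name and question wording are assumptions to swap for whatever your own export actually contains.

```python
# A minimal sketch, assuming you have exported the closed-question data as
# 'tracker_closed.csv' and that it has a column named 'Q9a' holding the
# agreement-scale responses -- adjust both to match your real export.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("tracker_closed.csv")

# Count the responses to one question and draw a simple bar chart.
counts = df["Q9a"].value_counts()
counts.plot(kind="bar", color="#b5651d")  # swap in your own brand colour
plt.title("Q9a: My institution supports me to use my own digital devices")
plt.ylabel("Number of respondents")
plt.tight_layout()
plt.savefig("q9a_responses.png")
```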

Arrangement 2: focus on a few signature blooms
In each of the four areas of the tracker, why not choose one or two sub-questions that are of particular interest to your organisation, or that differentiate your student respondents in an interesting way?

For example:

  • You and your digital: Q5d (‘How often do you use digital tools or apps to look for additional resources not recommended by your tutor/lecturer?’)
  • Digital at your institution: Q9a (‘My institution supports me to use my own digital devices’)
  • Digital on your course: Q14f (‘As part of your course, how often do you produce work in digital formats other than Word/Powerpoint?’)
  • Attitude to digital learning: Q22 (‘How much would you like digital technologies to be used on your course?’)

We recommend looking at the results for the two new summary questions: overall scores of institutional digital provision (Q13) and quality of digital teaching and learning (Q18). If you want to run more advanced statistical tests, these two metrics can be compared with other indicators to look for trends and contrasting results.
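If you do fancy a first statistical dig, a rank correlation between the two summary questions and another rating is one simple place to start. The sketch below assumes numeric columns called ‘Q13’, ‘Q18’ and ‘Q22’ in your export; those names, and the need to convert rating-scale answers to numbers first, will depend on your own file.

```python
# A rough sketch, assuming numeric columns named 'Q13', 'Q18' and 'Q22' in
# the same .csv export -- the names are assumptions, and rating-scale answers
# may need converting to numbers before this will run.
import pandas as pd

df = pd.read_csv("tracker_closed.csv")

# Spearman rank correlation suits ordinal rating scales better than Pearson,
# and pandas can calculate it directly.
print(df[["Q13", "Q18", "Q22"]].corr(method="spearman"))
```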

Arrangement 3: show off your free-text data
In our experience the free text questions provide specific, actionable detail at institution level. It’s worth downloading these and focusing your analysis on them. Coding or marking up your data does not need special software: you can use colour highlighting, or group your responses into themes (there’s a small scripted shortcut sketched after the list below, if you prefer). It helps to have more than one person do this so you can compare results. Two useful questions are:

  • Q11 ‘To improve your experience of digital teaching and learning… what one thing should we DO?’
  • Q14a ‘Please give an example of a course digital activity that you have found really useful’
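If you would like a scripted first pass before the manual read-through, a simple keyword tagger can group responses into rough themes. This is only a sketch: the column name, the themes and the keywords are all placeholders you would replace after skimming your own responses.

```python
# A very rough sketch of keyword-based theme tagging for Q11 responses,
# assuming a free-text export with a column named 'Q11'. The themes and
# keywords are placeholders, not a recommended coding frame.
import pandas as pd

themes = {
    "wifi": ["wifi", "wi-fi", "connection", "eduroam"],
    "vle": ["vle", "moodle", "blackboard", "canvas"],
    "lecture capture": ["recording", "lecture capture", "recorded"],
}

df = pd.read_csv("tracker_freetext.csv")

def tag(response):
    """Return the themes whose keywords appear in a single response."""
    text = str(response).lower()
    return [theme for theme, words in themes.items()
            if any(word in text for word in words)]

df["themes"] = df["Q11"].apply(tag)
print(df["themes"].explode().value_counts())
```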

This is an arrangement that students will really appreciate, so invite them to view it at the first opportunity.

Arrangement 4: foreground local species

Some questions provide data that is very particular to your institution. The free text questions are one example; so are the questions about the digital environment (‘digital at your institution’). Questions about digital course activities and the VLE can be localised further by partitioning responses according to students’ broad subject area (if you used this question). Your custom questions are also likely to touch on local issues. Try focusing your report on those issues that are local and unique.
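If you want to try that partitioning by subject area in a script rather than a pivot table, a cross-tabulation is enough. The column names below (‘subject_area’ and ‘Q14f’) are assumptions; use whatever headings appear in your own export.

```python
# A minimal sketch, assuming the closed-data export includes a broad
# subject-area column (called 'subject_area' here) and the course-activity
# question 'Q14f' -- both names are assumptions.
import pandas as pd

df = pd.read_csv("tracker_closed.csv")

# Cross-tabulate responses by subject area, shown as row percentages.
table = pd.crosstab(df["subject_area"], df["Q14f"], normalize="index") * 100
print(table.round(1))
```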

Some of the other questions are better analysed at a sector level. For example, the Jisc team and expert panels will be looking at broad trends in issues such as personal device use and attitudes to digital learning. We will explore the impact of gender, stage of course, and organisational type, as well as some other organisational factors. These findings are likely to be relevant across institutions, and the analysis will be more robust with a larger sample size. So, unless these questions are of specific interest to you – or you have good reason to believe that your institution is different to the norm – feel free to leave this work to us. National sector reports will be published in or before September 2018.

Arrangement 5: colour-by-numbers

Over the next month we will be developing short slide decks and formatted single-page templates that allow you to plug in your data and present the findings quickly and efficiently. We will email to let you know when these are available. We’d appreciate knowing what formats would work best for you – please share your ideas on the jiscmail list.

Some technical advice

This year we have removed all ‘don’t know’ options from the agreement scale questions. If you want to compare your percentage ‘agree’ figures with last year’s results, remember to adjust last year’s figure first by removing the ‘don’t knows’ and re-calculating the percentage based on just the agree/disagree figures.
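For example (with made-up figures): if 60% of last year’s respondents agreed, 25% disagreed and 15% said ‘don’t know’, the adjusted figure is 60 ÷ (60 + 25) ≈ 71% agree, and that is the number to set against this year’s result.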

If you want to get into more advanced analysis, or just understand what your data means in more detail, we offer a wealth of practical advice in the Guide to analysing your data and the Question by Question Guide, both available at bit.ly/trackerguide.
