Tracker case studies

This page provides links to 7 full case studies from the 2016 pilot of the Tracker, and 15 institutional snap-shots from the 2017 pilot. The case studies give useful background on why we developed the Tracker and how it benefits institutions overall. The snap-shots reflect a more up-to-date question set and process, and look at specific practical issues in using the Tracker – from first steps to creating organisational change.

In addition, the online Guide to Responding to your Findings includes useful tips about how institutions have responded to each question.

Snap-shots (2017)

This report, Insights from Institutional Pilots (July 2017), covers many of the practical issues involved in running the Tracker.

  • How did Liverpool University get a high response rate from its HE students?
  • How did the University of Adelaide engage students in dialogue about their digital experience?
  • How did Epping Forest College (EFC) engage their learners as change agents?
  • How did the University of Northampton use the Tracker to secure institutional actions and investments?
  • How did Bexhill Sixth Form College use the Tracker to benchmark and compare?
  • How has Durham University used free-text responses to identify what students want?
  • How did ACL Essex use the Tracker successfully with ACL (Adult and Community Learning) and WBL (Work-Based Learning) learners?
  • How was the Tracker used across a consortium of University Colleges in London?
  • How was the Tracker used to support an international, cross-university research project?
  • How did RMIT (Melbourne, Australia) use the Tracker to find out more about students’ digital learning?
  • How did Nottingham Ningbo overcome initial resistance to make the Tracker an ongoing project?
  • How is the University of Derby using the Tracker to support curriculum change in HE?
  • How did Ulster come back from a low response rate to run a successful Tracker the second time around? You can read their reflections on their use of the Tracker here.
  • How has the University of Stirling changed its approach to student feedback?
  • How did North Lindsey College use the Tracker to enhance the digital curriculum?

Case studies (2016)

The pilot implementation and evaluation of the Tracker (March–June 2016) involved 24 institutions, split evenly between HE and FE/skills providers. As part of that process we explored the experiences of 7 institutions in more detail. Each case study reviews the rationale for engaging with the Tracker, the process of stakeholder engagement and implementation, the findings, and how the organisation has responded. We would like to thank all the case study institutions for their participation and for their willingness to share their experiences.

Summary of key lessons learned (2016)

Rationale

  • The Tracker can provide colleges with a valuable source of evidence to submit to Ofsted and to support area reviews, demonstrating how their learners’ views inform the development of the digital environment.
  • There is real value in demonstrating to staff that decisions about digital learning are based on evidence – and for staff to see that their efforts are being recognised and are meeting learners’ needs.
  • Involving the Students’ Union in discussions about the digital environment raises awareness among student representatives of the role technology plays in the overall learning experience: it can lead to better student engagement around digital issues.
  • There is no other credible, valid and comparable instrument available to assess the digital experience of learners.

Implementation and student engagement

  • Survey fatigue is an issue at many campuses.
  • A high response rate can be achieved by combining distribution methods, sending persistent reminders, and varying the message to students, e.g. posting findings as they emerge and asking ‘Do you agree?’ or ‘What do you think?’
  • A targeted sampling strategy allows a site to be confident of meeting its completion targets and to distribute responses in a planned way across different course areas. It also reduces the likelihood of the sample being biased towards students with more digital awareness and confidence (a proportional-quota approach is sketched after this list).
  • Targeting student representatives in a survey of this kind has several benefits: a higher response rate, a more representative spread of subject areas and stages of study, and the assurance of working with engaged students who have things to say. There are some risks, in that student reps are a self-selected group who may differ from other students in ways that are significant for the findings of the Tracker.
  • In some settings, cultural issues may lead to biased sampling (e.g. where only the most digitally confident and engaged feel able to participate). Solutions include use of paper versions and using student champions to complete surveys on behalf of other students in face-to-face settings.
  • The Tracker is straightforward to deliver to work-based and other off-campus learners because it is accessed via a single URL on an open website.
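
To make the targeted sampling point above concrete, here is a minimal sketch of a proportional-quota plan in Python. It assumes a hypothetical student-record export with a course_area column; the file name, column names, overall target and oversampling factor are all illustrative assumptions, not part of the Tracker service.

    # Minimal sketch of a proportional-quota sampling plan (illustrative
    # file and column names; not the Tracker's actual data format).
    import pandas as pd

    TARGET_RESPONSES = 500  # overall completion target (assumed)

    students = pd.read_csv("student_records.csv")  # hypothetical export

    # Allocate the target across course areas in proportion to their size,
    # so responses are distributed in a planned way rather than by chance.
    counts = students["course_area"].value_counts()
    quotas = (counts / counts.sum() * TARGET_RESPONSES).round().astype(int)

    # Invite more students than the quota in each area (here 4x) to allow
    # for non-response, capped at the number available in that area.
    invites = pd.concat(
        students[students["course_area"] == area].sample(
            n=min(quota * 4, counts[area]), random_state=1
        )
        for area, quota in quotas.items()
    )
    print(quotas)
    print(f"{len(invites)} students invited")

Monitoring completions against these quotas during the survey window then shows which course areas need extra reminders.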

Using the Tracker with other approaches

  • The Tracker could be used in conjunction with a personal discovery tool, so that learners feel supported and helped to progress in their digital experience, rather than simply surveyed about it.
  • The Tracker can be used effectively alongside internal surveys and other annual processes (IT services review etc) to build a more detailed picture of what students expect from their digital experience.
  • The Tracker could be tied in more closely with other surveys, reinforcing the message that digital issues are part of the mainstream learning experience. For example, if module evaluation forms include no questions about the digital experience, both students and staff receive the message that it doesn’t matter. A separate instrument can pick up added detail, but effort is still needed to embed digital issues into other measures.

Analysis

  • Sector benchmarking is a valuable feature of the Tracker data. Because benchmarks interest senior managers, they give the Tracker evidence and process a higher profile and a better chance of impact. Benchmarking also allows a focus on areas of particular strength and weakness.
  • Tracker data can be used to identify issues that require further exploration in focus groups and Learner Voice initiatives.
  • With a large enough sample, Tracker data can be used to identify local trends and differences, and probe these findings in detail e.g. across different subject areas, year groups, or campus cohorts.
  • A large sample size is likely to reveal significant variations between subject areas, but more work will be needed to understand whether this is related to differences in the student cohorts, and/or in subject requirements, or whether it represents inequalities in provision that need to be addressed.
  • Tracker data can be used as a credible source of evidence in formulating strategies for digital learning and the digital environment.
  • Open questions are very popular and should not be ignored in analysis. They can be analysed quickly, e.g. using word clouds, word sifts and word counts, or in more detail, e.g. using qualitative analysis software, theory building, or the help of student representatives and focus groups (a minimal analysis sketch follows this list).
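
To illustrate two of the quick analyses above, here is a minimal Python sketch that compares a score across subject areas and runs a simple word count over an open question. The CSV file, column names and stop-word list are illustrative assumptions; the Tracker’s actual export format may differ.

    # Minimal sketch: subject-area comparison plus a quick free-text word
    # count (illustrative file and column names, not the real export).
    import re
    from collections import Counter

    import pandas as pd

    responses = pd.read_csv("tracker_responses.csv")  # hypothetical export

    # 1. Local trends and differences: mean score by subject area (only
    #    meaningful where each group has a reasonably large sample).
    by_subject = (
        responses.groupby("subject_area")["digital_experience_score"]
        .agg(["mean", "count"])
        .sort_values("mean")
    )
    print(by_subject)

    # 2. Quick free-text analysis: word counts over an open question,
    #    skipping a small stop-word list and very short tokens.
    STOPWORDS = {"the", "and", "to", "of", "a", "in", "is", "it", "for"}
    words = Counter(
        w
        for text in responses["comments"].dropna().astype(str)
        for w in re.findall(r"[a-z']+", text.lower())
        if w not in STOPWORDS and len(w) > 2
    )
    print(words.most_common(20))

The frequent words surfaced here are a starting point for the more detailed qualitative follow-up mentioned above, not a substitute for it.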

Different groups of learners

  • Work-based learners have specific concerns in their digital experience, which the Tracker can help organisations to understand and address.
  • Technology plays a pivotal role in supporting learners in the workplace. These learners can have a fragmented experience of technology use, but the Tracker allows them to see their digital learning in the round and to comment on it in a safe way.
  • Designing inclusive digital learning experiences is an important consideration for all institutions: the Tracker offers data to support this agenda e.g. how many students feel adequately supported to modify their devices and interfaces to meet their needs.
  • With adaptations for assistive software, the Tracker can be used to bridge mainstream and specialist colleges, allowing for a direct comparison of the digital experience of disabled students in different settings.
  • In some settings, cultural issues dominate students’ experience of digital technology.

Future development of the Tracker

  • There was widespread support for the Tracker to become an ongoing service, and many sites have already put in place plans to implement it again. In some cases there are plans to integrate it into student induction, or into course evaluation processes.
  • Future iterations of the Tracker need to be tailored to reflect the academic environment of sixth form colleges, where each student is likely to be taking a wider range of subjects.
  • Input from Specialist Colleges has been invaluable in ensuring the entire BOS system – and not just the Tracker – is more accessible to all.