On Wednesday this week we ran a webinar for the colleges and universities involved in the pilot implementation of the Tracker. Eight institutions attended, along with several Jisc account managers representing other pilot sites. Then on Thursday around 50 members of the Jisc Student Experience Experts' Group had the opportunity to explore the same findings.
In both cases we shared headlines from the data report, summarising almost 11,000 data sets from our 24 representative pilot sites – 12 in HE and 12 in FE/skills, including 2 Specialist Colleges. We also reviewed how pilot sites found the experience, and asked for further feedback to help Jisc decide on the future of the Tracker.
Slides from the Experts' Group meeting are openly available on SlideShare. Please be aware that these headline findings lack the detail and explanation you can read in the full report.
From the webinar we learned that:
- Most institutions are still analysing and discussing their results
- All have engaged senior stakeholders, some with the result that Tracker findings are feeding directly into new strategies
- Nearly all have a plan for responding to the findings that involves engaging learners (7 of 8)
- Many have undertaken or are planning focus groups to explore specific issues in more depth
- All want to run the Tracker again in the future (8 of 8)
- All recommend that Jisc make the Tracker available as a full service (8 of 8)
- The benchmarking capability of the survey – delivered in BOS – is highly valued
- Minor edits were suggested to the Tracker questions
By a small margin, institutions wanted to keep the core set of questions but to add their own. Keeping the core set while having access to further common, optional questions (authored and tested by Jisc) was also popular. None wanted complete freedom to adapt and create questions, given the importance of benchmarking and of using validated, pre-tested questions.
From the Experts' Group meeting we confirmed several of these findings. For example, delegates wanted more questions about digital technologies in the curriculum. They also wanted the chance to ask in more detail about learners' independent and informal use of digital tools and media. They were interested in the possibility of linking Tracker data with data from learning analytics. And they were very keen on longitudinal use of the Tracker as well as on its use for benchmarking.
Having heard more about the Tracker experience from Northampton, one of the pilot sites, delegates were overwhelmingly in favour of having access to the Tracker themselves (see show of hands below). They were asked to consider what questions they would particularly like to ask of their own students, and to suggest ways in which the Tracker might be implemented and used to benefit their institution. The results were collated and are available here. A Storify of the whole meeting and a Periscope recording of the Tracker session are also available.
We are now compiling a small number of institutional case studies, after which the Tracker pilot will be complete. Thank you again to all the sites who have worked so hard to get us this far.