
UCISA digital capabilities survey

Happy new year! At last I have got around to a blog post about the outcomes of the 2014 UCISA survey on digital capabilities (this link is to the Executive Summary report – full report in a few weeks’ time). Publication of the executive summary was accompanied by a webinar in December which looked at some of the implications for strategy and future directions, the digital environment including ‘bring your own’, and new practices in and alongside the curriculum. There is also a ning discussion community where further comment and discussion are taking place. The findings and discussion are interesting in their own right, so here I’m just going to point up some of the ways they tie in with our own outcomes and challenge areas.

First, staff who are thinking of implementing change – whether that is to the digital environment, support services, or the curriculum – appear most influenced by student expectations and by findings from research. Given that most of the research in this area relates to the student experience, it seems that staff are (or think they are) very responsive to messages from students about their digital requirements. This may tie in with the ‘student satisfaction’ agenda more broadly, or it may be that digital technology is an area in which staff think students have something uniquely valuable to contribute.

Jisc will be pleased to know that the Digital Literacies infokit and other outputs from the Developing Digital Literacies programme have had the most influence on staff awareness and behaviour. There is an appetite for institutional auditing/benchmarking in this area – in fact many institutions seem to have taken part in the belief that the survey itself would help them benchmark their current standing. UCISA points to existing audit resources from the DDL programme to support this. But clearly it would suit the majority of institutions to have access to an up-to-date, rigorously maintained, consensually developed and properly piloted audit tool, and to take part in an iterative and shared process of auditing digital capabilities at a strategic level. The UCISA survey can help raise awareness of developments across institutions, but it was never intended to support detailed information gathering inside institutions, and this is where the challenge lies. There is surely a link with the appetite for ‘research findings’ – we have argued from the start of this project that the greatest value to institutions will come when they have proven means for undertaking their own research into the student experience.

With regard to definitions of digital literacy, fluency or capability, there is widespread agreement on the need for a common understanding of terms, and a strong steer for Jisc and UCISA to collaborate on this. Over and above this, commentators agreed that we need good examples of definitions in practice: frameworks, statements of graduate outcomes, disciplinary benchmarks and so on. Our current conceptions also need to be updated to acknowledge the prevalence of mobile technologies, apps and touch-screen interfaces.

It seems likely that all three of these issues – consensual definitions and frameworks, audit tools and a data clearing house, and credible research – may be addressed by the new Jisc challenge area of Building Digital Capabilities. Meanwhile I must turn my attention to the 50 or so magnificent institutional exemplars which I have received and written up since November. Big thanks to everyone involved, and please can I blame the number and quality of examples for the fact that I am late in getting them online.
