Into the wild – Technology for open educational resources

Reflections on three years of the UK OER Programmes.


Between 2009 and 2012 the Higher Education Funding Council for England (HEFCE) funded a series of programmes to encourage higher education institutions in the UK to release existing educational content as Open Educational Resources. The HEFCE-funded UK OER Programme was managed by the JISC and the Higher Education Academy. The JISC CETIS “OER Technology Support Project” provided support for technical innovation across the Programme. This book synthesises and reflects on the approaches taken and lessons learnt across the Programme and by the Support Project.

This book is not intended as a beginners' guide or a technical manual; instead, it is an expert synthesis of the key technical issues arising from a national publicly-funded programme. It is intended for people working with technology to support the creation, management, dissemination and tracking of open educational resources, and particularly those who design digital infrastructure and services at institutional and national level.

Availability

Published by University of Bolton, Deane Road, Bolton, BL3 5AB

ISBN: 978-0-907311-35-5 (print on demand: book (£3.36) printed by Lulu; or free pdf to print yourself)
ISBN: 978-0-907311-36-2 (ebook, Kindle: free download; or from Amazon (77p))
ISBN: 978-0-907311-37-9 (ebook, ePub: free download)
ISBN: 978-0-907311-38-6 (ebook, pdf: free download)
(All prices are the minimum for the distribution channel)

Licence and source

Into the wild – Technology for open educational resources by Amber Thomas, Lorna M. Campbell, Phil Barker and Martin Hawksey (Eds) is licensed under a Creative Commons Attribution 3.0 Unported License.

You are free to share (copy, distribute and transmit the work), to remix (adapt the work) and to make commercial use of the work, provided that you attribute the origin of the work (if possible, please include the title, the names of the editors/authors and a link to this page).

To help you re-use this work, editable formats are available. We originally wrote the book using Booktype, an online collaborative authoring and publishing platform; Booktype allows you to clone our source, so contact Phil Barker if you would like to do so. There is also a Word .docx file that we used for the final published versions.

Errors and bugs?

There are some minor bugs in some versions: bullet points don't display well in the Kindle version, reference links are erratic in the ePub version (more so in some readers than others), and the images in the print pdf have white lines on them. We hope none of these are serious problems for you. If you do find a serious problem, please contact Phil Barker.

CETIS Analytics Series: The impact of analytics in Higher Education on academic practice

Link: CETIS Analytics Series Vol 1, No 10. Analytics for Teaching Practice (pdf)
Link: CETIS Analytics Series Vol 1, No 10. Analytics for Teaching Practice (MS Word .docx)

Many strong claims have been made for Learning Analytics and its potential to transform the education system. These claims deserve to be treated with caution, particularly as they relate to teaching practice.

The introduction of these techniques cannot be understood in isolation from the methods of educational management as they have grown up over the past two centuries. These methods are conditioned by the fact that educational managers are limited in their capability to monitor and act upon the range of states which are taken up by teachers and learners in their learning activities. Strategies for simplification have been developed which classify the range of knowledge as a number of subjects, reduce the subjects to courses, and assign students to cohorts which carry out the same activities. Teachers, meanwhile, deal as best they can with the full variety of learners’ needs in their practice. Over the years, an accommodation has developed between regulatory authorities, management and teaching professionals: educational managers indicate the goals which teachers and learners should work towards, provide a framework for them to act within, and ensure that the results of their activity meet some minimum standards. The rest is left up to the professional skills of teachers and the ethical integrity of both teachers and learners.

This accommodation has been eroded by the efforts of successive governments to increase their control over the education received by both school and higher education students. Learning Analytics radically reduces the effort involved both in gathering information on the way in which lecturers deliver the curriculum and in analysing that information. The combination of these two trends has the potential to constrain teaching practice, and it is therefore necessary to take a systemic view when assessing the impact of analytics on teaching practice.

Three types of analytics intervention are discussed, in terms of their impact on practice.

  • efficiency in the wider functioning of the institution, which has few implications for teaching practice,
  • enhanced regulation of the teaching and learning environment, which has potentially negative impact on teaching practice,
  • methods and tools intended to help lecturers carry out their tasks more effectively, which have the potential to be a useful tool in teaching practice.

It is concluded that Learning Analytics should not be seen as a short cut to providing teaching professionals with universal advice on ‘what works’, and that its use to increase the accountability of teachers to management may have unintended negative consequences. Rather, the most promising area for enhancing teaching practice is the creation of applications which help teachers identify which of the many interventions open to them are most worthy of their attention, as part of an on-going collaborative inquiry into effective practice.


CETIS Analytics Series: A Brief History of Analytics

Link: CETIS Analytics Series Vol 1, No 9. A Brief History of Analytics (pdf)
Link: CETIS Analytics Series Vol 1, No 9. A Brief History of Analytics (MS Word .docx)

The potential of analytics, on this definition, is to help us evaluate past actions and estimate the potential of future actions, and so make better decisions and adopt more effective strategies as organisations or individuals. Analytics allows us to increase the degree to which our choices are based on evidence rather than on myth, prejudice or anecdote.

Several factors are coming together at the moment to stimulate interest in making more use of analytics. One of these is the increased availability, detail, volume and variety of data arising from the near-ubiquitous use of ICT (Information and Communication Technology) throughout almost all facets of our lives. This aspect tends to be the focus of the news media, but data alone is not enough to realise benefits from analytics. A less popularised factor driving effective exploitation of analytics is the rich array and maturity of techniques for data analysis; a skilled analyst now has many disciplines to draw inspiration from and many tools in their toolbox. Finally, the increased pressure on business and educational organisations to be more efficient and better at what they do adds the third leg to the stool: data, techniques, need.

This paper, one of the CETIS Analytics Series, is aimed at readers who wish to be introduced to the range of techniques that are being pieced together and labelled as Analytics. It does this by outlining some of the most important communities – each with their own origins, techniques, areas of limitation and typical question types – and suggests how they are contributing to the future, with special reference to the context of post-compulsory education.

The diversity and flexibility of some of the techniques lined up under the analytics flag is evidenced by the numerous different applications of analytics: financial markets, sports analytics, econometrics, product pricing and yield maximisation, fraud and crime detection, spam email filters, marketing, customer segmentation, organisational efficiency and even tracking the spread of infectious disease from web searches. Behind these applications we can find the roots of analytics in the birth of statistics in the eighteenth century, but since then different applications of statistics and IT have led to different communities of practice that now seem to be merging together. We see that Web Analytics pioneers are now exploiting data from the “social web” by using Social Network Analysis, and that the techniques of Information Visualisation are supporting interactive and exploratory forms of analysis rather than just the graphs in management reports. Subjects that some see as old-hat, such as Operational Research, and others that are often perceived as futuristic, such as Artificial Intelligence, are each making contributions in surprising ways. Meanwhile, the education community has made its own contributions; Social Network Analysis and Artificial Intelligence have both emerged from academic research, and we are now beginning to see sector-specific variants of analytics being put to work in the form of Educational Data Mining, Learning Analytics and bibliometrics.
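
As a purely illustrative sketch of the kind of Social Network Analysis mentioned above (the account names, edge data and use of the Python networkx library are our own assumptions, not taken from the paper), a few lines of code can compute simple influence indicators over a toy "who mentions whom" graph:

    # Illustrative only: a toy "who mentions whom" graph of the kind found on
    # the social web. Assumes the networkx library; the edge data is invented.
    import networkx as nx

    edges = [("alice", "bob"), ("bob", "carol"), ("carol", "alice"), ("dave", "alice")]
    G = nx.DiGraph(edges)

    in_degree = dict(G.in_degree())  # how often each account is mentioned
    pagerank = nx.pagerank(G)        # a recursive measure of relative influence

    for node in sorted(G.nodes()):
        print(node, in_degree[node], round(pagerank[node], 3))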

CETIS Analytics Series: Institutional Readiness for Analytics

Link: CETIS Analytics Series Vol 1, No 8. Institutional Readiness for Analytics (pdf)
Link: CETIS Analytics Series Vol 1, No 8. Institutional Readiness for Analytics (docx)

This briefing paper is written for managers and early adopters in further and higher education who are thinking about how they can build capability in their institution to make better use of data that is held on their IT systems about the organisation and provision of the student experience. It will be of interest to institutions developing plans, those charged with the provision of analytical data, and administrators or academics who wish to use data to inform their decision making. The document identifies the capabilities that individuals and institutions need to initiate, execute, and act upon analytical intelligence.

For the purposes of this paper, the term Learning Analytics (LA) is used to cover these activities, using the following definition:

Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data. (CETIS, 2012)

The proposition behind learning analytics is not new. In the school sector particularly, good teaching practice has long involved record keeping with pen and paper, and more recently with technology, and the analysis of and reflection on this data to inform courses of action. Similarly, in different ways, all higher education (HE) and further education (FE) institutions use data to inform their decision making in assessment boards and course committees. However, as institutions increasingly use technology to mediate, monitor, and describe teaching, learning and assessment through Virtual Learning Environments (VLEs) and other systems, it becomes possible to develop ‘second generation’ learning analytics. The large data sets being acquired are increasingly amenable to new techniques and tools that lower the technical and cost barriers to undertaking analytics. This allows institutions to experiment with data to gain insight, to improve the student learning experience and student outcomes, and to identify improvements in the efficiency and effectiveness of provision.
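
As a minimal, purely hypothetical sketch of what such a ‘second generation’ analytic might look like (the access log, cohort and threshold below are invented, and no particular VLE or institutional system is assumed), the following Python fragment flags students whose recorded activity falls well below the cohort average:

    # Hypothetical VLE access log: one entry per page view, keyed by student id.
    from collections import Counter

    access_log = ["s01", "s02", "s01", "s03", "s01", "s02", "s01", "s04", "s02"]
    cohort = {"s01", "s02", "s03", "s04", "s05"}   # s05 has never logged in

    views = Counter(access_log)
    mean_views = sum(views.values()) / len(cohort)
    threshold = 0.5 * mean_views                   # illustrative cut-off only

    at_risk = sorted(s for s in cohort if views.get(s, 0) < threshold)
    print("students flagged for follow-up:", at_risk)

Which signals to count, and what any threshold should mean pedagogically, are of course exactly the questions an institution would need to work through before acting on such output.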


CETIS Analytics Series: A Framework of Characteristics for Analytics

Link: CETIS Analytics Series Vol 1, No 7. A Framework of Characteristics for Analytics (pdf)
Link: CETIS Analytics Series Vol 1, No 7. A Framework of Characteristics for Analytics (MS Word docx)

This paper, the seventh in the CETIS Analytics Series, considers one way to explore the similarities, differences, strengths, weaknesses, opportunities, etc. of actual or proposed applications of analytics. It is a framework for asking questions about the high-level decisions embedded within a given application of analytics and for assessing the match to real-world concerns. The Framework of Characteristics is not a technical framework.

This is not an introduction to analytics; rather, it is aimed at strategists and innovators in the post-compulsory education sector who have appreciated the potential for analytics in their organisation and who are considering commissioning or procuring an analytics service or system that is fit for their own context.

The framework is conceived for two kinds of use:

  1. Exploring the underlying features and generally-implicit assumptions in existing applications of analytics. In this case, the aim might be to better comprehend the state of the art in analytics and the relevance of analytics methods from other industries, or to inspect candidates for procurement with greater rigour.
  2. Considering how to make the transition from a desire to target an issue in a more analytical way to a high level description of a pilot to reach the target. In this case, the framework provides a starting-point template for the production of a design rationale in an analytics project, whether in-house or commissioned. Alternatively, it might lead to the conclusion that significant problems would arise in targeting the issue with analytics.

In both of these cases, the framework is an aid to clarify or expose assumptions and so to help its user challenge or confirm them.


CETIS Analytics Series: Analytics for Understanding Research

Link: CETIS Analytics Series Vol 1, No 4. Analytics for Understanding Research (pdf)
Link: CETIS Analytics Series Vol 1, No 4. Analytics for Understanding Research (MS Word .docx)

Analytics seeks to expose meaningful patterns in data. In this paper, we are concerned with analytics as applied to the process and outputs of research. The general aim is to help optimise research processes and deliver improved research results.

Analytics is the use of mathematical and algorithmic methods to describe part of the real world, reducing real-world complexity to a more easily understandable form. The users of analytics seek to use the outputs of analytics to better understand that part of the world; often to inform planning and decision-making processes. Applied to research, the aim of analytics is to aid in understanding research in order to better undertake processes of planning, development, support, enactment, assessment and management of research.

Analytics has had a relatively long history in relation to research: the landmark development of citation-based analytics was approximately fifty years ago. Since then the field has developed considerably, both as a result of the development of new forms of analytics and, more recently, in response to new opportunities for analytics offered by the Web.

Exciting new forms of analytics are in development. These include methods to visualise research for comparison and planning purposes, new methods – altmetrics – that exploit information about the dissemination of research that may be extracted from the Web, and social network and semantic analysis. These methods promise to broaden the application areas of analytics markedly.

The view here is that the use of analytics to understand research is now a given part of contemporary research, at researcher, research group, institutional, national and international levels. Given the fundamental importance of the assessment of research and the role that analytics may play in it, it is of paramount importance for the future of research to construct institutional and national assessment frameworks that use analytics appropriately.

Evidence-based impact agendas are increasingly permeating research, and adding extra impetus to the development and adoption of analytics. Analytics that are used for the assessment of impact are of concern to individual researchers, research groups, universities (and other institutions), cross-institutional groups, funding bodies and governments. UK universities are likely to increase their adoption of Current Research Information Systems (CRIS) that track and summarise data describing research within a university. At the same time, there is also discussion of increased ‘professionalisation’ of research management at an institutional level, which in part refers to increasing standardisation of the profession and its practices across institutions.

The impetus to assess research is, for these and other social, economic and organisational reasons, inevitable. In such a situation, reduction of research to ‘easily understandable’ numbers is attractive, and there is a consequent danger of over-reliance on analytic results without seeing the larger picture.

With an increased impetus to assess research, it seems likely that individual researchers, research groups, departments and universities will start to adopt practices of research reputation management. However, the use of analytics to understand research is an area fraught with difficulties that include questions about the adequacy of proxies, validity of statistical methods, understanding of indicators and metrics obtained by analytics, and the practical use of those indicators and metrics in helping to develop, support, assess and manage research.
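
To make ‘indicators and metrics’ concrete, here is a minimal sketch of one long-established citation-based indicator, the h-index (the citation counts below are invented); it also shows how much a single number compresses away, which is precisely the adequacy-of-proxies concern raised above:

    # h-index: the largest h such that h of an author's papers have at least
    # h citations each. The citation counts below are invented for illustration.
    def h_index(citations):
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    print(h_index([25, 8, 5, 3, 3, 1, 0]))   # prints 3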

To use analytics effectively, one must at least understand some of these aspects of analytics, and certainly understand the limitations of different analytic approaches. Researchers, research managers and senior staff might benefit from analytics awareness and training events.

Various opportunities and attendant risks are discussed in section 5. The busy reader might care to read that section before (or instead of) any others.