Learning Analytics Expert Workshop

 


At the beginning of the day, Riina Vuorikari of JRC-IPTS provides a description of the goals of the project.

Why is JRC-IPTS organising this? The JRC is the Joint Research Centre of the European Commission, its in-house research centre. IPTS is the Institute for Prospective Technological Studies.

The project is intended to provide evidence-based scientific and technical support throughout the policy cycle at the European level.


There was a set of lightning presentations, which Doug Clow did an excellent job of live blogging at the following post in the section called European Lightning presentations. Each presenter was an expert in the field of learning analytics and presented briefly on their respective work.

 


Rebecca Ferguson welcomes everyone to the event. At the start, the community takes a moment to acknowledge the loss of Erik Duval. Doug Clow provides a nice tribute to Erik, in which he remembers how Erik put an emphasis on family life while managing a demanding academic career. Erik will be a missed member of the learning analytics community.

 

Rebecca then proceeds to introduce the Learning Analytics European Policy (LAEP) project. She talks about the three research questions for the project:

  1. What’s the state of the art in Learning Analytics?
  2. What are the prospects for implementation?
  3. What’s the potential for European policy?
[Photo: Adam Cooper]

From these research questions the project took on its work. Adam Cooper put together a Glossary of Terms so that people new to the field could become familiar with it. Rebecca requested that anyone who has input on missing terms, or suggested improvements to the definitions, provide feedback on the Cloud Works site.

Adam Cooper then conducted a literature review on learning analytics with a focus on implementation. Rebecca mentions that there have been many literature reviews on the topic, but this is the only one with a focus on implementation.

 

Then the project created a set of inventories of policy, practice, and tools.  Andrew Brasher provides an overview of the 14 policies, 13 practices, and 19 tools covered in the inventory.


Six case studies are covered by Jenna Mittelmeier & Garron Hillaire. The presentation outlines that these cases reflect a variety of company types, and that the key takeaways are not meant to be a checklist for how to make effective policies, but rather facets of the challenges policy makers face when working on learning analytics policies. After introducing the organisations in each of the cases, a slide is shown that illustrates key takeaways for policy makers.

 


 

Then came a workshop in which we worked as groups to develop six positions and rotated discussions in order to develop and refine the ideas.

Then one person from each group presented the group's idea, while the rest of the group rotated around the room to hear ideas from the other groups.

The goal was to prepare the person to present a position on what the future of learning analytics will become. Then we got the entire group in one large circle. The people who had just practiced pitching their ideas were then asked to present the refined ideas to the entire group.

 

 

As people presented their ideas to the entire group, anyone could challenge the ideas to continue to refine and improve the concepts. It was fun to hear people present and defend the ideas of the workshop.


Inventory of Learning Analytics Tools, Practices, and Policy Documents – Call for Nominations

The LAEP Inventory is being developed as a broad-but-shallow collection of informative reference points that will be used to inform the study and to illustrate the points identified in it, as well as being a useful resource for the learning analytics community at large. The LAEP project team will be doing its own desk research, but we would value your nominations of examples you think we should include.

The ask: Please add a comment or “like” someone else’s comment. All we need is a few words and a URL, but if you want to say why the example is interesting, please do. Suggestions don’t have to be from your own experience, but if they are and you would be happy for us to contact you for more information, please say so.

The offer: A draft version of the Inventory will be published for open comment early in 2016, and the final report we will prepare for the Institute for Prospective Technological Studies will also be published as an open access resource.

What Will the Inventory Contain?

We are seeking examples that are representative of the breadth of the current global state of learning analytics across the three categories: tools, practices, and policy documents. The LAEP Project is, however, concerned with the practical adoption of learning analytics, as opposed to theoretical, technical, conceptual, or highly contextualised research. We are particularly interested in examples which do not appear in academic journals or conference proceedings, since these are often less easily found.

We are looking for examples of learning analytics tools, which might be:
a general analytics tool – a tool which is not specialised to learning analytics but which has been used in a learning analytics context;
a learning environment tool – a tool which relates closely to guiding a learning activity, typically informing users who then choose how to act;
a smart system – a tool which is adaptive;
a student-support tool – a tool oriented to student support other than in relation to the acquisition of knowledge, skills, or competence;
a design and planning tool – a tool which supports curriculum or learning design, or a related aspect of the environment in which learning is promoted;
… or a kind of tool we haven’t thought of.

We are looking for examples of learning analytics practices, which we interpret broadly to include:
a pilot – refers to cases where multiple stakeholders are engaged in implementing learning analytics across multiple discrete contexts, although it may be that only the initial steps of implementation have begun;
an example at scale – learning analytics is being practiced at scale and has been practiced for sufficient duration to reveal strengths and weaknesses, although only within a single context;
a candidate for mainstreaming – substantially similar learning analytics practices have been replicated across different contexts, for example with quite different types of learner or in different organisations.

Policy documents which are of interest might be:
formal policies – documents typically issued by government agencies, local authorities, or individual institutions which define policy in relation to learning analytics;
good practice advice – less formal documents which might also be referred to as “best practices” or “code of practice” for undertaking learning analytics, and which might be of an interim status such as a “draft for discussion”;
adoption/implementation advice – documents which advise on the process of adoption of learning analytics, i.e. which refer to change management aspects (in contrast to advice related to undertaking learning analytics);
strategy-level white papers – documents which are informative and which are aimed at stakeholders in any part of the education or training landscape who are involved with strategy formulation and implementation;
analysis of policy-related issues – conference reports, white papers or research which explicitly considers the policy space of learning analytics adoption.

All of the above are just examples of the kind of thing we are looking for; do feel free to suggest others.


Thanks in advance!

Learning analytics for European educational policy – LAEP

The LAEP team is exploring the implications and opportunities of learning analytics for European educational policy.

We are investigating the state of the art of learning analytics in Europe and beyond. We are also looking at what may happen in the field during the next 10–15 years.

To do this, we are putting together an inventory of evidence as well as carrying out in-depth case studies and expert consultations.

The project will provide recommendations for European education policy to guide and support the take-up and adaptation of this technology to enhance education in Europe.

For more details, contact rebecca.ferguson@open.ac.uk