Trusted Learning Analytics

The main objective of learning analytics is to unveil previously hidden information from educational data to gain new insights and to provide feedback about these gained insights to different educational stakeholders (e.g. learners, teachers, parents, and managers).

This information can support individual learning and enhance teaching quality, but it can also improve organisational knowledge about management processes and system administration. Learning analytics involves various pre-processing steps, from basic information models to the structuring of data: data harvesting, storage, cleaning, anonymisation, mining, analysis, and visualisation.
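The pre-processing steps named above can be sketched as a small pipeline. This is a minimal illustration only; all data, field names and functions below are hypothetical, and a real pipeline would read from an LMS export or a learning record store rather than hard-coded samples.

```python
import hashlib

def harvest():
    """Collect raw interaction events (here: a hard-coded, hypothetical sample)."""
    return [
        {"student": "s.jansen@example.org", "resource": "quiz-1", "score": "80"},
        {"student": "s.jansen@example.org", "resource": "quiz-1", "score": "80"},  # duplicate
        {"student": "a.devries@example.org", "resource": "quiz-2", "score": None},  # incomplete
    ]

def clean(events):
    """Cleaning step: drop incomplete records and exact duplicates."""
    seen, out = set(), []
    for e in events:
        key = tuple(sorted(e.items()))
        if e["score"] is not None and key not in seen:
            seen.add(key)
            out.append(e)
    return out

def anonymise(events):
    """Anonymisation step: replace identifiers with one-way hashes (pseudonymisation)."""
    return [
        {**e, "student": hashlib.sha256(e["student"].encode()).hexdigest()[:12]}
        for e in events
    ]

def analyse(events):
    """Analysis step: aggregate the mean score per resource."""
    totals = {}
    for e in events:
        totals.setdefault(e["resource"], []).append(int(e["score"]))
    return {r: sum(s) / len(s) for r, s in totals.items()}

events = anonymise(clean(harvest()))
print(analyse(events))  # → {'quiz-1': 80.0}
```

Only one of the three raw events survives cleaning, and its e-mail address is pseudonymised before any analysis takes place, mirroring the ordering of the steps listed above.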
Dr. Hendrik Drachsler

Background

In the last decade, the growing amount of user and usage data and the development of open and linked data sources have created new challenges for society. Learning analytics (LA) is a research field that aims at understanding the potential and limitations of big data for learning support. Despite the great enthusiasm currently surrounding the field of learning analytics, there are substantial questions to be answered by research. Along with technical research questions such as the scalability of the infrastructure, the compatibility of educational datasets, the comparability and adequacy of algorithms, and appropriate visualisation technologies, there are other problem areas that influence the acceptance and the impact of LA. Among these are questions of privacy, data ownership, transparency of the analytic process, ethical use and the dangers of abuse, as well as the demand for new key competences to interpret and act on learning analytics results and the need for LA-supported instruction methods.


Data science in education goes by the name ‘Learning Analytics’, an umbrella term for research questions from overlapping research domains such as educational science, computer science and data science. The use of data to inform decision-making in education and training is not new, but the scope and scale of its potential impact on teaching and learning have increased by orders of magnitude over the last few years. We are now at a stage where data can be automatically harvested at previously unimagined levels of granularity and variety. Analysis of these data has the potential to provide evidence-based insights into learner abilities and patterns of behaviour that in turn can provide crucial insights to guide curriculum design, to improve outcomes for all learners, to shift assessment from mainly summative to more formative forms, and thus to contribute to national and European economic and social well-being.


Despite the great interest surrounding learning analytics and the various (start-up) companies that have jumped on the bandwagon with learning analytics tools, most educational organisations are still in the initial phases of implementing a learning analytics strategy. Considering the five-step sophistication model developed by the Society for Learning Analytics Research (SoLAR) (Siemens et al., 2013), there is still a lot of work to be done to transform the educational sector into a data-driven educational science. We therefore conduct research according to the Learning Analytics Framework by Greller & Drachsler (2012), guided by the following research questions.

Leading research questions

  • What kind of data models are most supportive for educational research and practice?
  • How can LA data be stored, processed and used to create valuable/useful tools for the various educational stakeholders (teachers, students, parents, managers)?
  • How can the stakeholders (teachers, students, parents, managers) be supported with personalised information based on the LA data?
  • How can existing ethics and privacy guidelines be applied for the uptake of LA in Europe?
  • Which additional competences do educational stakeholders (teachers, students, parents, managers) need to deal with the affordances of LA tools?
  • How can LA information be used with existing instructional design methods?

People

On 22 September 2017 Maren Scheffel defended her PhD thesis 'The Evaluation Framework for Learning Analytics'.
Promotor: prof. dr. M.M. Specht; co-promotor: prof. dr. H. Drachsler.
Read the press release (in Dutch): Learning analytics tools in onderwijs beter evalueren en vergelijken.

Projects

  • Competen-SEA
    The COMPETEN-SEA project is a Capacity Building in Higher Education (CBHE) project funded by the Erasmus+ program of the European Commission. The main objective is to enable South-East Asian Universities to develop accessible, affordable, high quality and effective educational services to various population groups now excluded from traditional educational outreach. 
    European project
  • SHEILA
    To assist European universities to become more mature users and custodians of digital data about their students as they learn online, the SHEILA project will build a policy development framework that promotes formative assessment and personalised learning, by taking advantage of direct engagement of stakeholders in the development process.
    European project
  • SafePAT
    The objective of the SafePAT project is to improve patient safety by reducing risks associated with patient admission, transfer and discharge, through connecting and standardising best-practice examples in the Euregio Meuse-Rhine (EMR).
    Interreg project
  • The Evaluation Framework for Learning Analytics (EFLA)
    The Evaluation Framework for Learning Analytics (EFLA) addresses the current lack of evaluation instruments by offering a standardised way to evaluate learning analytics tools and to measure and compare the impact of learning analytics on the educational practices of learners and teachers. 
    Research project
  • The Dutch xAPI specification
    We developed a complete overview of xAPI statements that various xAPI projects have implemented to enable the deployment of customised Learning Analytics solutions. With this inter-project and inter-institutional specification of xAPI we aim to stimulate a national Dutch xAPI movement and also to contribute to the international definition of usage of xAPI specifications around the world.
    Research project
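An xAPI statement follows the actor/verb/object pattern of the Experience API specification, which is what the inter-project specification above standardises across institutions. The sketch below builds one minimal statement; the verb IRI comes from the public ADL verb registry, while the activity ID, names and score are hypothetical.

```python
import json

# Minimal xAPI statement: who (actor) did what (verb) to what (object).
statement = {
    "actor": {
        "objectType": "Agent",
        "mbox": "mailto:student@example.org",  # hypothetical learner
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # ADL verb registry
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/courses/statistics/quiz-1",  # hypothetical activity
        "definition": {"name": {"en-US": "Statistics quiz 1"}},
    },
    "result": {"score": {"scaled": 0.8}, "success": True},
}

print(json.dumps(statement, indent=2))
```

Agreeing on which verbs and activity definitions to use for which educational events, as the Dutch xAPI specification does, is what makes statements like this comparable across projects and learning record stores.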
  • The Trusted LA Infrastructure
  • Ethics and Privacy in LA – EP4LA
    This interactive workshop series aims to raise awareness of major ethics and privacy issues. It is also used to develop practical solutions for learning analytics researchers and practitioners that will enable them to advance the application of learning analytics technologies.
    Research project
  • Multimodal Learning Analytics
    This research investigates the potential of collecting and analysing multimodal learning data such as hand movements, gaze or physiological information through wearable sensors and IoT devices. These data can be used for automatic formative assessment and for empathic agents that adapt the learning experience based on an individual learner’s characteristics, activity or context.
    PhD research Daniele di Mitri
  • Learner Dashboards to support self-regulated learning
    Learning analytics dashboards are tools that support learners in making informed decisions about their learning. However, most of the existing dashboards follow a “one size fits all” philosophy disregarding individual differences between learners. This PhD project researches a learning dashboard that caters to the individual needs of learners throughout the self-regulated learning process.
    PhD research Ioanna Jivet
  • Multimodal learning analytics for collaborative learning
    This research is based on using multimodal learning analytics (MMLA) in collaborative learning scenarios, e.g. collaborative problem solving, collaborative programming, etc. The goal is to use sensors to detect different multimodal cues which indicate collaboration and then facilitate/improve collaboration by using real-time interventions supported by MMLA.
    PhD research Sambit Praharaj
  • SURF SIG LA
    Special interest group

Completed projects

  • LACE: Learning Analytics Community Exchange
    European project (7th Framework Programme)
    The project aimed to create a community of people interested in learning analytics and educational data mining. The idea behind this community was to build bridges between research, policy and practice to realise the potential of learning analytics and educational data mining in Europe.
    LACE is now a special interest group under SoLAR. 
  • ECO 
    ECO was a European project based on Open Educational Resources (OER) that gave free access to MOOCs (Massive Open Online Courses) in six languages.
    The main goal was to broaden access to education and to improve the quality and cost-effectiveness of teaching and learning in Europe.

  • LinkedUP
    European project (7th Framework Programme)
    The project aimed to push forward the exploitation of public, open data available on the Web, in particular for education. In order to do that, LinkedUp organised a LinkedUp Challenge: the project challenged the educational world to realise personalised university degree-level education of global impact based on open Web data and information.

  • Open Discovery Space
    European project (7th Framework Programme)

    The project aimed to develop a socially-powered and multilingual open learning infrastructure to boost the adoption of eLearning resources. The interface was designed with students, teachers, parents and policy makers in mind.

Publications

Key publications:

  • Jivet, I., Scheffel, M., Drachsler, H. & Specht, M. (to appear). License to evaluate: Preparing learning analytics dashboards for the educational practice. Proceedings of the Eighth International Conference on Learning Analytics and Knowledge, LAK’18. 
  • Suárez, A., Ternier, S., Helbig, R., & Specht, M. (2017, 30 October). DojoAnalytics: A Learning Analytics interoperable component for DojoIBL. In F. Loizides, G. Papadopoulos, & N. Souleles (Eds.), Proceedings of the 16th World Conference on Mobile and Contextual Learning - mLearn 2017 (pp. 1-8). Larnaca, Cyprus. ACM 2017. http://dspace.ou.nl/handle/1820/8816 
  • Scheffel, M. (2017, 22 September). The Evaluation Framework for Learning Analytics. Doctoral thesis. Heerlen, The Netherlands: Open Universiteit (Welten Institute, Research Centre for Learning, Teaching and Technology)  http://dspace.ou.nl/handle/1820/8259 
  • Jivet, I., Scheffel, M., Drachsler, H., & Specht, M. (2017). Awareness Is Not Enough: Pitfalls of Learning Analytics Dashboards in the Educational Practice. In E. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Data Driven Approaches in Digital Education: Proceedings of the 12th European Conference on Technology Enhanced Learning (EC-TEL 2017), LNCS: Vol. 10474 (pp. 82-96). Springer, Cham. http://dspace.ou.nl/handle/1820/7985 
  • Scheffel, M., Drachsler, H., Toisoul, C., Ternier, S., & Specht, M. (2017). The Proof of the Pudding: Examining Validity and Reliability of the Evaluation Framework for Learning Analytics. In E. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Data Driven Approaches in Digital Education: Proceedings of the 12th European Conference on Technology Enhanced Learning (EC-TEL 2017), LNCS: Vol. 10474 (pp. 194–208). Berlin, Heidelberg: Springer. http://dspace.ou.nl/handle/1820/8390 
  • Di Mitri, D., Scheffel, M., Drachsler, H., Börner, D., Ternier, S., & Specht, M. (2017). Learning pulse: a machine learning approach for predicting performance in self-regulated learning using multimodal data. In M. Hatala et al. (Eds.) Proceedings of the Seventh International Learning Analytics & Knowledge Conference, LAK ’17 (pp. 188–197). New York, NY, USA: ACM. http://dspace.ou.nl/handle/1820/7394 
  • Scheffel, M., Drachsler, H., Kreijns, K., De Kraker, J., & Specht, M. (2017). Widget, Widget As You Lead, I Am Performing Well Indeed!: Using Results from an Exploratory Offline Study to Inform an Empirical Online Study About a Learning Analytics Widget in a Collaborative Learning Environment. In Proceedings of the Seventh International Conference on Learning Analytics and Knowledge, LAK ’17 (pp. 289–298). New York, NY, USA. ACM. http://dspace.ou.nl/handle/1820/8777 
  • Scheffel, M., Drachsler, H., de Kraker, J., Kreijns, K., Slootmaker, A., & Specht, M. (2017). Widget, widget on the wall, am I performing well at all? IEEE Transactions on Learning Technologies, 10(1), 42-52. http://dspace.ou.nl/handle/1820/7540 
  • Drachsler, H., & Kalz, M. (2016). The MOOC and learning analytics innovation cycle (MOLAC): a reflective summary of ongoing research and its challenges. Journal of Computer Assisted Learning, 32(3), 281-290. http://dspace.ou.nl/handle/1820/6769 
  • Berg, A., Scheffel, M., Drachsler, H., Ternier, S., & Specht, M. (2016). Dutch Cooking with xAPI Recipes: The Good, the Bad and the Consistent. In Proceedings of the International Conference on Advanced Learning Technologies (ICALT’16) (pp. 234–236). http://dspace.ou.nl/handle/1820/7524 
  • Di Mitri, D., Scheffel, M., Drachsler, H., Börner, D., Ternier, S., & Specht, M. (2016). Learning Pulse: Using Wearable Biosensors and Learning Analytics to Investigate and Predict Learning Success in Self-regulated Learning. In R. Martinez-Maldonado & D. Hernandez-Leo (Eds.), Proceedings of the First International Workshop on Learning Analytics Across Physical and Digital Spaces, Vol. 1601 (pp. 34-39): CEUR Proceedings http://dspace.ou.nl/handle/1820/7525 
  • Drachsler, H. & Greller, W. (2016, 25-29 April). Privacy and Analytics – it’s a DELICATE issue. A Checklist to establish trusted Learning Analytics. 6th Learning Analytics and Knowledge Conference 2016, Edinburgh, UK. http://dspace.ou.nl/handle/1820/6381 
  • Pijeira-Díaz, H. J., Drachsler, H., Järvelä, S., & Kirschner, P. A. (2016). Investigating collaborative learning success with physiological coupling indices based on electrodermal activity. In Proceedings of the Sixth International Conference on Learning Analytics and Knowledge (LAK'2016) (pp. 64-73). (April 25-29, 2016, Edinburgh, UK). New York, NY, USA: ACM. http://dspace.ou.nl/handle/1820/8682 
  • Drachsler, H., Verbert, K., Santos, O. C., & Manouselis, N. (2015). Panorama of Recommender Systems to Support Learning. In F. Ricci, L. Rokach, & B. Shapira (Eds.), 2nd Handbook on Recommender Systems (pp. 421-451). Springer, US. http://dspace.ou.nl/handle/1820/6276 
  • Tabuenca, B., Kalz, M., Drachsler, H., & Specht, M. (2015). Time will tell: The role of mobile learning analytics in self-regulated learning. Computers & Education, 89, 53–74. http://dspace.ou.nl/handle/1820/6172 
  • Scheffel, M., Drachsler, H., & Specht, M. (2015). Developing an Evaluation Framework of Quality Indicators for Learning Analytics. In J. Baron, G. Lynch, N. Maziarz, P. Blikstein, A. Merceron, & G. Siemens (Eds.), Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (LAK’15) (pp. 16–20). New York, NY, USA: ACM. http://dspace.ou.nl/handle/1820/7534 
  • Scheffel, M., Drachsler, H., Stoyanov, S., & Specht, M. (2014). Quality indicators for learning analytics. Educational Technology & Society, 17(4), 117–132. http://dspace.ou.nl/handle/1820/5711 
  • Drachsler, H., Stoyanov, S. & Specht, M. (2014, March). The Impact of Learning Analytics on the Dutch Education System. Paper presented at The 4th International Conference on Learning Analytics and Knowledge, Indianapolis, Indiana, USA. http://dspace.ou.nl/handle/1820/5322 
  • Manouselis, N., Verbert, K., Drachsler, H., & Santos, O. C. (Eds.) (2014). Recommender Systems for Technology Enhanced Learning: Research Trends & Applications. Springer. http://dspace.ou.nl/handle/1820/5734 
  • Manouselis, N., Drachsler, H., Verbert, K., & Duval, E. (Eds.) (2012). Recommender Systems for Learning. Berlin, Springer, 2012, 90 p. http://dspace.ou.nl/handle/1820/4647 
  • Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning Analytics. Educational Technology & Society, 15(3), 42–57. http://dspace.ou.nl/handle/1820/4506 
  • Verbert, K., Manouselis, N., Drachsler, H., & Duval, E. (2012). Dataset-Driven Research to Support Learning and Knowledge Analytics. Educational Technology & Society, 15(3), 133–148. http://dspace.ou.nl/handle/1820/4636
  • Drachsler, H., & Greller, W. (2012). The pulse of learning analytics. Understandings and expectations from the stakeholders. In S. Buckingham Shum, D. Gasevic, & R. Ferguson (Eds.), 2nd International Conference Learning Analytics & Knowledge (pp. 120-129). April, 29-May, 02, 2012, Vancouver, BC, Canada. http://dspace.ou.nl/handle/1820/3850
