An evaluation framework for Massive Open Online Courses for Professional Development (MOOCs4PD): the case of the Learn2Analyze MOOC

Keywords
Evaluation of MOOCs; Perceived competence advancement; Educational Data Literacy Competence Profile; Learn2Analyze MOOC; MOOCs for professional development

Abstract
Massive Open Online Courses (MOOCs) can be a valuable tool for professional development (PD), as they offer flexible and cost-effective opportunities for professional competence development at large scale. Nevertheless, the literature reports certain shortcomings of MOOCs4PD, such as low completion rates, limited engagement and social participation, and a lack of credible assessment, mostly inherited from the design of MOOCs targeting a general audience.
This thesis contributes to the discussion on the evaluation of MOOCs and proposes an evaluation framework for MOOCs4PD based on learners’ perceived competence advancement. The aim of the evaluation is to explore the factors that affect participants’ perceived competence advancement, focusing on the learners’ profile and the learning experience reported upon completion. The core question of the evaluation is:
“What are the areas of possible improvement for the offered competence-based Professional Development MOOC, so as to improve the quality of the learning experience and effectively cultivate the competences of participants?”
As a means of validation, a successful application of this methodological framework is presented. More specifically, the proposed evaluation framework is applied to the case of the Learn2Analyze MOOC, a competence-based MOOC4PD for online education professionals, which aims to support the development of basic competences for Educational Data Analytics in online and blended teaching and learning.