Massive Open Online Courses (MOOCs) are taking off in a big way, but their value is still being parsed. MOOCs have been advanced by several top universities, including Stanford, MIT, and Harvard, and the number of users is in the millions. This level of participation generates an immense amount of data that is ripe for study. Interestingly, the data collected thus far show that completion rates for these massive courses range only between 5% and 12% (Perna, Ruby, Boruch, Wang, Scull, Seher, & Evans, 2014, p. 241).
In a recent article from Educational Researcher, 16 Coursera courses taught by the University of Pennsylvania were studied to understand how users progress through a MOOC (Perna et al., 2014). The authors operate from the idea that students must complete a series of steps to achieve an educational outcome. With that in mind, they examined whether users progressed through each course sequentially or on their own terms. In addition, they worked to identify key milestones that might help predict which users complete a course. It's important to note that the authors distinguish between a registrant and a starter because "the representation of starters among registrants varied considerably across courses, ranging from 53% to 95%" (Perna et al., 2014, p. 425). In other words, those who registered for a course didn't always start it, so the term user is often too broad.
So, how did users progress through these 16 Coursera courses? Most treated them like typical courses and walked through the material sequentially. A small number created their own path through the course, taking advantage of the materials being open and online. What's interesting to me about those who created their own path is that their retention rates were between 1% and 4% higher than those of users who went through the course sequentially (Perna et al., 2014, p. 425). This matters for my work on blended learning at the secondary level. What I'm curious about is whether giving students control and a more personalized experience (i.e., creating their own path through the course) improves outcomes more generally. What we know from this study is that such students were slightly more likely to complete the course, but what else could this mean for a more personalized educational experience, and how might it translate to a public secondary school?
When it comes to milestones on the way to course completion, accessing at least one lecture was one, but there was a significant drop-off (23%) between the first and second modules (Perna et al., 2014, p. 425). An even more significant milestone was attempting a quiz, but only 1 in 5 students did so (Perna et al., 2014, p. 426). Thinking about online or partially online courses, this raises the question of how to get students engaged in the content. Of course, completing a MOOC may not be the end goal. MOOC users are often simply enriching their lives, so perhaps they are satisfied with perusing course content as they please. But how do teachers and course designers get students engaged enough to devour the content, rather than lose interest and drop out? These are important questions.
This study had millions of data points, but as the authors note in their closing thoughts, "'big data' is insufficient" (Perna et al., 2014, p. 429). User experience is also important to understand. Another teacher and I will be walking through our first Coursera course in a few weeks, and in an effort to contribute to the literature, I plan to journal through the experience as a sort of pilot study of user experience. You may find that journal here, so check back!