I am reflecting on week one of the Learning and Knowledge Analytics 2011 (#LAK11) open online course. I have spent the week reading the articles assigned by the course leaders, exploring the Moodle forum where discussions are taking place, listening in on the Elluminate session which featured guest speaker John Fritz from UMBC, keeping an eye on the LAK11 hashtag on Twitter, and discussing LAK11 with my colleague Andrew Deacon here at the Centre for Educational Technology. The latter is a tremendous benefit, as having a colleague who is also interested in the course is a great way of sharing and negotiating ideas through casual discussion.
This week I am going to look at the various definitions of learning and knowledge analytics, explore how institutions are using educational analytics, and unpack some of the concerns that have been raised in the readings.
Educational analytics defined
Educational analytics have been defined as:
The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. (Learning Analytics 2011 Conference site: https://tekri.athabascau.ca/analytics)
The intersection of technology, management culture, and the application of information to manage the academic enterprise. (Goldstein, 2005)
Benchmarking and evaluating teaching activities and interpreting the relationship between observed student online behaviour and implemented pedagogical practice. (Dawson, 2010)
Exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings which they learn in. (Baker & Yacef, 2009)
The merging of “data collation with statistical techniques and predictive modelling that can be subsequently used to inform both pedagogical practice and policy.” (Macfadyen & Dawson, 2010)
Academic analytics can be used to profile and even predict students who may be at risk, by analyzing demographic and performance data of former students. (Fritz, 2011)
Extracting intelligence from data
The notion of educational analytics is an attempt to capitalize on all of the data which we either deliberately or inadvertently collect in the process of educating our students. I would hazard a guess that in nearly any higher education course today there is some activity occurring online: within institutional learning management systems (LMS), course registration systems, content management systems (CMS), enterprise resource planning (ERP) systems, course blogs/websites, forums, etc. Educational analytics is an attempt to understand the transactions, events, records, and occurrences which exist within these systems in order to improve educational outcomes.
This interest in educational analytics is largely inspired by the successes of business intelligence (BI). Companies have been able to analyze their own large databases of sales and other information to predict and strategize for the future. BI in the corporate sector is a huge industry, with vendors like SAP, SAS, Oracle, and Microsoft all having BI offerings.
Depth of analytics
Goldstein (2005) defined five stages in understanding the depth in which an institution (corporate or education) applies analytics:
Stage 1: Extraction and reporting of transaction level data (representative)
e.g. Report on the number of students registered in a class
Stage 2: Analysis and monitoring of operational performance (monitoring)
e.g. Report on student dropouts in relation to course results
Stage 3: “What-if” decision support such as scenario building (proactive)
e.g. Report on estimated number of graduates based on cohort size in first year
Stage 4: Predictive modelling & simulation (forecasting)
e.g. Report on potentially at risk students based on historical student performance data.
Stage 5: Automatic triggers of business processes (automated)
e.g. Automatic email generated to student as a result of not accessing a resource, interacting online, etc.
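A stage 5 trigger can be sketched in a few lines. Everything here is hypothetical — the record layout, the inactivity threshold, and the addresses are assumptions for illustration, not any particular LMS's API:

```python
from datetime import datetime, timedelta

# Hypothetical LMS activity records: (student_email, last_login)
activity = [
    ("alice@example.edu", datetime(2011, 1, 10)),
    ("bob@example.edu", datetime(2010, 12, 1)),
]

def find_inactive(records, as_of, threshold_days=14):
    """Return students who have not logged in within the threshold."""
    cutoff = as_of - timedelta(days=threshold_days)
    return [email for email, last_login in records if last_login < cutoff]

# A real stage 5 system would hand these addresses to the mail server
# automatically, rather than printing or returning them.
inactive = find_inactive(activity, as_of=datetime(2011, 1, 15))
```

The interesting design question at stage 5 is not the code but the threshold: what counts as "not interacting" differs by course design, which is why the trigger condition should be configurable per course.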
Goldstein (2005) and Fritz (2009) note that the majority of current analytic activity in the education sector is focused on stage 1, extraction and reporting of transaction level data. This is largely due to the fact that:
- Most reporting to government bodies and national agencies is representative and historic
- High-level skills are required for reporting, analyzing and digesting data
- People don’t always have access to the data they would need to begin analysis
- Data is scattered amongst multiple systems (ERP, LMS, CMS, web analytics) with no centralized repository
Most university data centres are protected and controlled, so gaining access to LMS, CMS, and ERP systems can be a difficult process, often involving multiple departments. Perhaps the current trend towards open and public data will help shift data centre policies, but of course privacy and disclosure issues may be difficult to circumvent.
Perhaps one of the unique challenges for the education sector in harnessing BI is that we so often run more than one system. In my own experience, most institutions have at least both an ERP system and an LMS. Maybe it's just an excuse, but I tend to think industry has been slightly more successful with BI because companies are more likely to have all of their data in one system, the ERP. (I'm happy to be challenged on this.)
Integrating cloud data in analytics
What makes things even more challenging is the amount of academic social and collaborative activity occurring in the cloud. Students may be using Facebook or instant messaging to connect and chat about an assignment instead of using the institutional LMS. So we don't have access to the data being recorded, but Facebook does, and can therefore place an ad for textbooks or other study opportunities on a student's Facebook home page. Unifying data from various institutional systems will be challenging enough; amalgamating the data from cloud services such as Google Docs, Facebook, Skype, Twitter, Google Analytics and others presents entirely new challenges!
Many institutions have spent the last few years improving their technology infrastructure and online services; they may now be ready to turn their attention towards making use of the data these systems collect. Educational analytics will likely have to make use of data warehousing, as current academic systems are not unified and will have to be woven together in central repositories.
The need for warehousing of data was echoed by John Fritz in his example of using analytics at UMBC. While trying to query the live LMS to display personalized data as a user logs in, they managed to crash the system. Since then they have set up a data warehouse which combines LMS activity data with student grade and biographical data, and they are able to query that directly when needed.
So while it is true we have a wealth of data at our institutions, most of us do not yet have a centralized place to unify data and extract intelligence from multiple data sources. Many universities have established Management Information Offices, Institutional Research Centres, and Planning Departments, which may be the best place to unite all of the institution's data.
Extract, transform, load
The distributed nature of data today means we will likely need to reassemble data from various sources in order to move beyond stage 2 of Goldstein's (2005) analytics framework. Data will have to be extracted from the various systems, transformed into a unified structure so that it can be compared against other data, and then loaded into a system where analysis can take place. Many people, myself included, do this with something like Excel, SPSS, or another statistical package, but it can be automated and done with much more sophistication using data warehousing tools.
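The extract-transform-load idea can be sketched in miniature. This is a toy illustration, not a real warehouse: the two extracts are hypothetical in-memory lists standing in for LMS and ERP exports, and the column names are invented:

```python
import sqlite3

# Extract: hypothetical dumps from two source systems.
lms_logins = [("s001", 42), ("s002", 3)]          # (student_id, login_count)
erp_grades = [("s001", 71.0), ("s002", 48.5)]     # (student_id, final_mark)

# Load target: one repository so the data can be analysed together.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE analytics (student_id TEXT, logins INT, mark REAL)")

# Transform: join the two extracts on student_id into one unified row.
marks = dict(erp_grades)
rows = [(sid, logins, marks.get(sid)) for sid, logins in lms_logins]
conn.executemany("INSERT INTO analytics VALUES (?, ?, ?)", rows)

# The repository can now answer cross-system questions directly,
# e.g. students with both low activity and low marks.
low = conn.execute(
    "SELECT student_id FROM analytics WHERE logins < 10 AND mark < 50"
).fetchall()
```

Real warehousing tools add scheduling, incremental loads, and data cleaning on top of this basic pattern, but the extract-transform-load shape is the same.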
Some inspiration and the opportunities going forward
Learning is a product of interaction. Depending on the epistemology underlying the learning design, learners might interact with instructors and tutors, with content and/or with other people. Many educators expend enormous amounts of effort to designing their learning to maximize the value of those interactions. Regardless of the approach taken, a series of questions consistently arises: How effective is the course? Is it meeting the needs of the students? How can the needs of learners be better supported? What interactions are effective? How can they be further improved?
Elias, 2011; p. 1
The information on student behaviour captured by the LMS has been rarely interrogated and adopted beyond basic load and tool usage. The quantity and diversity of data available regarding student online learning behaviour, interactions with peers and teaching staff and access to other institutional ICT systems (student services, library, etc.) for example, affords an opportunity for integrating automated student learning support and services.
Dawson et al., 2010; p.121
Research opportunities aplenty!
My colleague and I are brainstorming the possibility of doing a study of our own institutional LMS data, looking for indicators of success based on online activity. We will share the results as we explore.
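A first pass at such a study might simply correlate online activity with course results. The data below is invented for illustration, and a real study would of course need far more students and controls, but the mechanics are this simple:

```python
import statistics

# Hypothetical per-student data: LMS login counts and final course marks.
logins = [5, 12, 30, 44, 60]
marks = [40, 55, 62, 70, 85]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# r near +1 would suggest activity tracks achievement in this sample.
r = pearson(logins, marks)
```

Correlation is only a starting point; Macfadyen & Dawson's early-warning work goes further by building predictive models from such variables, which is where the real value for intervention lies.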
Baker, R.S.J.d. & Yacef, K. (2009) The State of Educational Data Mining in 2009: A Review and Future Visions. http://www.educationaldatamining.org/JEDM/images/articles/vol1/issue1/JEDMVol1Issue1_BakerYacef.pdf
Dawson, S. (2010) ‘Seeing’ the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology, Vol. 41, No. 5. (2010), pp. 736-752.
Elias, T. (2011) Learning Analytics: Definitions, Processes, Potential http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
Fritz, J. (2011) Learning Analytics. Presentation prepared for Learning and Knowledge Analytics course 2011 (LAK11). http://www.slideshare.net/BCcampus/learning-analytics-fritz
Goldstein, P. J. (2005) Academic Analytics: Uses of Management Information and Technology in Higher Education http://net.educause.edu/ir/library/pdf/ecar_so/ers/ers0508/EKF0508.pdf
Macfadyen, L. & Dawson, S. (2010) Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, Vol. 54, pp. 588-599.