Researchers – Vivekanandan Kumar, Steve Harris, Joel Burrows, Clayton Clemens, David Boulanger, Jeremie Seanosky, Rebecca Guillot, Isabelle Guillot
What is unique about Learning Analytics (LA) with respect to Academic Analytics (AA), Intelligent Tutoring Systems (ITS), Educational Data Mining (EDM), Artificial Intelligence (AI), and Data Science (DS)?
Question 1 – Individual Answers
Dr. Vivekanandan Kumar: Data Science is the study of data and its properties. It explores the nature of data as it pertains to a tapestry of datasets, theories, models, algorithms and solutions originating from areas such as Computing, Informatics, Mathematics, Health, Education and Statistics.
Data Analytics is the study of Data Science, as in collecting ‘intelligence’ about Data Science. Data Analytics offers meta-analyses of one or more instances of applications of Data Science. Such meta-analyses result in measurements of traits such as utility, optimality and customisability, among other findings of ‘unknown common truth’, pertaining to these instances. These complex instances can be termed ‘analytic tapestries’. For example, a tapestry of Learning Analytics can be conceived of as a combination of a) social interaction data among a certain class of students, b) the cultural background of these students, c) the subject-matter performance of these students, d) a theory of self-regulation, e) a causal model of self-regulation, and f) algorithms that discover and measure relationships between social interaction, customs and self-regulation.
Each analytic tapestry can be conceived as an independent study on its own with multiple outcomes and potentially multiple underlying models it attempts to discover, learn and/or validate. However, being an analytic tapestry implies that a) underlying models of similar tapestries can be autonomously and ubiquitously meta-analysed to arrive at unknown common truths (to arrive at a common model that these tapestries agree upon) or b) underlying models of dissimilar tapestries can be autonomously and ubiquitously meta-analysed (through measurable relations between well-defined variables belonging to different tapestries) to arrive at the leading edge of knowledge portrayed by those tapestries. That is, being an analytic tapestry empowers all tapestries to autonomously and ubiquitously connect their underlying models with each other, irrespective of the locale of interest of a tapestry, the sample-sets exercised by the tapestry, the subject-matter discipline handled by the tapestry or the analytic techniques employed by the tapestry.
Artificial Intelligence (AI) plays a central role in analytics because, in an autonomous and ubiquitous fashion, it makes the tapestry ‘come alive’ and relates it to other tapestries, vets data input from a variety of sources, determines the relevance of a piece of data to a theory, instantiates models, connects models, discovers models, learns models, validates models, validates relations, discovers new parameters and offers remediation.
Learning Analytics is comparable to collecting intelligence on individuals for national security purposes, except that it collects intelligence on students’ learning. Agents/sensors are needed to collect data from various sources at different points in time over various study episodes or activities related to study. The agents must then be able to securely transmit the garnered data and inferred information to the headquarters back in their homeland. The data/information is then transformed into actionable insights. Specific actions will be carried out considering these valuable insights, and the impact of those actions on the ground will be evaluated. Intuitively, this whole process will be downright useless if not done within the proper timeline and enshrined in the greatest secrecy. Secrecy is crucial not only for the safety of the agents in foreign countries, but also for collecting authentic and valuable data. The country that succeeds in doing this in real time will have the edge over its foes. This notion of analytics holds in other domains such as business analytics, where a company will also have the upper hand if it understands the trends among the customer population and responds in a timely manner. As one can readily see, Learning Analytics is like obtaining intelligence on students’ learning strictly for educational purposes from carefully constructed analytic tapestries. Intelligence is based on any available data that can potentially come from anywhere at any time, implying that sensors must observe the learner as much as possible to seek any opportune data. It is a mechanism to target not only the competences of learners but also cognitive traits such as motivation, grit and so on.
The world of learning is no longer about this learner alone; it is about how to project the learner into the real world. In that sense, generating intelligence on students to help them succeed is the key distinctive feature of learning analytics. Whereas Learning Analytics offers intelligence on students, the learning environment, which includes teachers and other resources, offers data to construct entities such as the domain model, expert model, teaching model, student model, interaction model, feedback model, and so on. While Learning Analytics targets learners from a holistic perspective, Academic Analytics targets the learning environment; Intelligent Tutoring Systems target pedagogy-enhanced tools to tutor students; Educational Data Mining targets information and insights arising from well-defined and already available datasets. Artificial Intelligence in Education, as an overarching field, encompasses Learning Analytics, Intelligent Tutoring Systems, Educational Data Mining and Smart Learning Environments.
The second distinctive feature of Learning Analytics is the way it targets and collects intrusive data about students. Most of the data collected by Learning Analytics is personal data – study habits, preferences, style, metacognitive traits, demographic data, physiological data and so on. Learning Analytics explicitly seeks such sensitive data and advocates that the data be collected under the belief that they belong to the student first and foremost and be shared with others only after the student gives explicit permission. The need for the “HTTPA” protocol for data sharing is highly relevant for Learning Analytics. The point is that Learning Analytics seems to boldly go after intrusive data.
Another distinctive feature is the portrayal that Learning Analytics is a recent field while the others are established fields. This is particularly true when looking at Intelligent Tutoring Systems, which have been in existence for the last three decades. However, the outcomes of intelligent tutoring have been confined to laboratories and have failed to make a mainstream-level impact on student learning and performance, while Learning Analytics shows much promise toward mainstream, university-wide applications. In fact, Learning Analytics arrived just as Artificial Intelligence was undergoing a revamp. Terms such as Learning Analytics, Artificial Intelligence in Education, Big Data, and Data Science are quite marketable these days, unlike names such as Intelligent Tutoring or Educational Data Mining that do not seem to capture the imagination of people for wider deployment. In a way, Learning Analytics connects more with the real world than all the other areas.
The final and very important distinction is that in all these areas humans play a part in decision making. It is called “human in the loop.” Humans make most of the critical decisions and the machine plays a secondary role in the human-in-the-loop approach. Learning Analytics does support the human-in-the-loop approach, but it also accommodates just the opposite, a machine-in-the-loop approach. In this approach, decision-making is automated as much as possible. Hence, the machine is tasked with making the bigger decisions, leaving humans (such as teachers) to play a secondary role – needed only when necessary. Humans come in as a supplement to the machine loop. This is a key distinctive feature of learning analytics.
Joel Burrows: The big thing that separates learning analytics from other fields is its scope. For instance, academic analytics and educational data mining seem more focused at the institutional level, whereas learning analytics is much more personal; it is about trying to optimize the environment for a person. Between academic analytics and educational data mining, data mining seems more interested in the techniques, while academic analytics implies wiring everything up to create some sort of feedback loop where the action on the intelligence is built in. When looking at intelligent tutoring systems, learning analytics is the actual intelligence in intelligent tutoring systems. As for artificial intelligence and data science, they are the techniques that are going to be used by learning analytics. A lot of what learning analytics uses can be built on top of them.
The application of learning analytics to a specific domain (e.g., math, English writing, music) requires significant tuning. Thus, the domain really does matter, but at the same time, the Learning Analytics techniques provide the raw tools that need to be applied in the target domain. In the end, what makes learning analytics special is that it really is focused on improving and optimizing things for individual learners.
Clayton Clemens: All these disciplines and techniques are part of a learning analytics ecosystem, and each of them represents different inputs and outputs of learning analytics. For instance, educational data mining serves as the base data coming into a learning analytics ecosystem; academic analytics also provides input data, like grades and data on the actual process of academia; and artificial intelligence and data science are used to produce the insights and inferences that are expected from learning analytics systems. Intelligent tutoring systems are one possible output of all of those given inputs. In addition to these, there are many other areas, fields, or techniques that could be integrated into the field of learning analytics.
Application domains are super important. Different problems in learning analytics require different applications of disciplines, and there is not really a one-size-fits-all solution. Learning analytics will borrow from all these things in different ways and pick from the large scope of each of these fields to achieve its goals. Analytics in mathematics requires a very different set of technologies than analytics in language studies, for example. In summary, learning analytics encompasses all these fields in different ways.
Rebecca Guillot: Learning analytics is particularly concerned with the idea of capturing data on the learner and the learning environment daily to help the student learn more efficiently. All the other fields or disciplines provide extensive data, but these data do not necessarily capture information about the students’ learning processes. Their analyses are descriptive and summative in nature, looking into the past to understand what has happened without the possibility of modifying the course of actions to avoid undesirable outcomes. Academic analytics, for example, captures data like grades, attendance, dropouts, course completion, access to course materials, statistics on students, etc., but these data are not collected daily, directly from the learning environment as the student learns. Intelligent tutoring systems provide guidance but not necessarily access to historical data for retrospection. Educational data mining and data science are the toolbox for learning analytics, but they do not extract the meaning of the detected learning patterns. Artificial intelligence is useful for learning analytics to capture more data and perhaps to replace some human interventions, but it is only a part of learning analytics.
David Boulanger: All these terms seem to form a hierarchy in which the focus becomes narrower and the leveraged technology becomes more targeted down the hierarchy. In many respects, they all overlap, and they do not always have clear-cut distinctions between them. Learning analytics is intelligence and seeks opportunities to intervene in time to maximize gain. That is the distinction between educational data mining and learning analytics: educational data mining is more focused on pattern recognition/extraction from educational data without necessarily ensuring that learners or users immediately benefit from it, while learning analytics adds meaning to statistical data that otherwise would not produce any value for the learner or teacher. That is where the term “intelligence” appears to make sense. Academic analytics and learning analytics are opposite ends of a continuum, where academic analytics is concerned with the business side of educational institutions, while learning analytics is directly concerned with the learning and teaching process.
Jeremie Seanosky: Learning analytics tends to focus much more on the student; it is about analyzing the various learning traces captured from the various learning environments students engage in. The goal of learning analytics is really to help improve cognitive, metacognitive, and task-related skills and to personalize and adapt the learner’s learning experiences. Learning Analytics is more responsive to students in providing them with immediate or real-time feedback so they can improve, versus other areas like academic analytics or any systems that focus more on capturing data to be used by administrators or teachers. For instance, in Moodle, there is a lot of data, and doing academic analytics on these data will help teachers and other administrators improve courses in the future, but there is no direct, real-time impact on the learning experience of students. In other words, students cannot benefit from the data they produced.
Steve Harris: One is reminded of how social media developed and started to change the way that people communicated. It really turned communications on its head. As for intelligent tutoring systems, educational data mining, and even artificial intelligence, they generally tend to take a top-down approach, where students are taken through specific tasks that the teacher wants them to do. While tutoring systems can offer some options based on how students are interacting with the system, they are still based on a relatively top-down approach to ultimately attaining a goal. In contrast, learning analytics is almost bottom-up: it collects the intelligence directly from the learner, and these data can be used to improve the overall learning environment and learning process. Thus, the biggest differentiator is the focus on the learner in collecting this intelligence, as opposed to taking a top-down approach.
Question 1 – Group Discussion
Clayton Clemens: For some researchers, learning analytics seems to have a broad, sort of grandiose definition, whereas others seem to have a very specific idea of what learning analytics is and how it is distinct from other fields. What about the idea that some of these other fields, like AI and data science, are potential tools of learning analytics, or are considered its outputs or inputs?
Dr. Vivekanandan Kumar: Most of the top conferences in Artificial Intelligence in Education, EdMedia and Intelligent Tutoring Systems now include sessions on Learning Analytics. There is a dedicated international conference named Learning Analytics and Knowledge. These conferences now acknowledge Learning Analytics as part of their research, but at the same time, when looking at conferences on Education or Physics, one can see that they are not even aware of this upcoming area of Analytics. That is a real challenge. Data Science and Data Analytics should be investigated under any domain that employs computing as a core part of its research and development.
Steve Harris: This translates in the marketing world to web analytics or social media analytics, which tend to be data sources for systems like intelligent ad buying, content management systems, or customer-relation database clustering, that sort of thing. But if the interpretation of the learning data is not properly in place, along with the sensors to start collecting the right data, nothing else can be done. It is a fundamental layer.
Dr. Vivekanandan Kumar: On the other hand, people do see Learning Analytics as a marketing platform. Certain educational products, injected with marginal learning analytics capabilities, seem to ‘guarantee’ success. Plenty of universities are now exploring analytics and showcasing that to attract students.
David Boulanger: According to Schumacher, “Learning analytics provide benefits for all levels of stakeholders in the educational arena: mega-level (governance), macro-level (institution), meso-level (curriculum, teacher/tutor), and micro-level (learner).” Are we encompassing too much in the definition of learning analytics? Does learning analytics overlap too much with other fields, so that its key characteristics are lost?
Clayton Clemens: The accepted definition of learning analytics came from a 2011 conference and reads: “Learning analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” In that sense, it is possible to stretch the interpretation up to those mega, macro, meso, and micro levels that Schumacher was talking about, but it must come back down to the learners at some point. Taking learning analytics at that base level, the learner level, and trickling it back upwards to those higher levels as an aggregate is one thing, but if things measured at the institution level do not really have anything to do with the learners themselves and their performance, then they probably would not be part of learning analytics.
David Boulanger: The original Learning Analytics definition is deficient in the sense that all these terms (EDM, ITS, data science, AI, and many more) are encompassed directly or indirectly by it, and so it fails to provide a clear-cut delineation of Learning Analytics.
Joel Burrows: Educational data mining and academic analytics seem to be much more at the higher level. Educational data mining might find out that students with tattoos have higher grades. Based on this information, the administration may decide to open a tattoo parlour on the campus. However, that is not learning analytics, that is much more focused at the institution level, whereas learning analytics is much more focused on the actual individual learners and improving, optimizing their learning. Similarly, an intelligent tutoring system may help a student write an essay by performing spelling and grammar checks and assessing the reading level of the text the student writes. However, learning analytics will detect that the student is over-using the conjunction ‘and’ and suggest a tutorial on more sophisticated ways of joining sentences. Based on what the system has learned about the student, it can present the information in a way that is best suited for his/her learning style (audio, video, or text). Although there are many stakeholders involved in education, everything must come back down to making sure that the learning process of those learners is optimized.
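The conjunction example above can be made concrete with a minimal sketch. The word list, threshold, and function name below are illustrative assumptions, not part of any system described in this discussion; a real learning analytics tool would combine such a signal with the student model before recommending a tutorial.

```python
from collections import Counter
import re

# Hypothetical sketch: flag over-used conjunctions in a student's essay.
# The watched words and the 5% threshold are assumptions for illustration.
WATCHED_CONJUNCTIONS = {"and", "but", "so"}
OVERUSE_THRESHOLD = 0.05  # flag a word if it exceeds 5% of all tokens


def flag_overused_conjunctions(essay: str) -> dict:
    """Return each watched conjunction whose frequency exceeds the threshold."""
    tokens = re.findall(r"[a-z']+", essay.lower())
    total = len(tokens)
    counts = Counter(tokens)
    return {
        word: counts[word] / total
        for word in WATCHED_CONJUNCTIONS
        if total and counts[word] / total > OVERUSE_THRESHOLD
    }


essay = ("I woke up and ate breakfast and walked to school and met my "
         "friend and we talked and then class started.")
flags = flag_overused_conjunctions(essay)
print(flags)  # 'and' dominates this sample, so it is flagged
```

A production system would follow such a flag with the adaptive step Burrows describes: choosing the format of the remedial tutorial (audio, video, or text) from what is known about the learner.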
Clayton Clemens: It also depends on how many factors are used and quantified in relation to the performance of learners and how they are learning. At some point, it is questionable whether the causal relationship is still there.
Dr. Vivekanandan Kumar: Learning Analytics is an area that is trying to shape the development of the brain! That is where the area seems to be heading. Even though how a brain functions cannot be directly influenced, Learning Analytics is attempting to achieve it through indirect means. We should be concerned about the moral responsibility attached to such efforts.
Looking at Intelligent Tutoring Systems, or any technology related to education in the last thirty or forty years, education has remained flat; nothing has stood out to challenge the traditional ‘normal curve’ of performance. The responsibility for student learning and performance lies squarely with the student and the student alone. If a student fails, it is the student’s fault. No field of study has ever quantitatively assessed the negative impacts of certain pedagogies, as they are delivered, on student learning and performance. No study has attributed student learning and performance to teaching. If a student failed, would it be possible for the teacher to also take some responsibility and defend why the teacher had no choice but to let the student fail? Learning Analytics appears to have the potential to tackle this discrepancy, mostly because it can look at many factors such as interaction data, physiological data, economic data, and sociological data of an individual or a group, and then find causality, hopefully, and help learners and teachers understand the nature of study better and perform better in a much more personalized manner. Learning Analytics is not there yet, though.
David Boulanger: Learning analytics is just one side of the coin. Teaching analytics is the other side. Learning does not occur without teaching (or, can it?). How could teaching analytics be defined considering the four key distinctive features of LA, that is, 1) Learning Analytics is intelligence on students, 2) Learning Analytics requires intrusive data, 3) Learning Analytics is a recent field connecting to the real world, and 4) Learning Analytics accommodates a machine-in-the-loop approach?
Dr. Vivekanandan Kumar: Learning is not going to happen by itself, but teaching need not be a major element of learning! Learners can study on their own using any resource of their choice, without teachers. Of course, as of now, learners are responsible for their own learning. Whether somebody succeeds or fails, it is the learner’s responsibility. They paid the funds, and then they receive the evaluation at the end of the term. They cannot question why they did not get an A+. They cannot question why a teacher could not guide the student to achieve an A+. When thinking about teaching analytics, suppose there were information that specifically identified gaps in how the student was taught, or that there were grounds to hold the institution or a specific teacher responsible for not helping the student achieve his or her optimal achievement levels; then one can expect a huge shift in educational policies. Learning Analytics will be at the forefront when such shifts are considered by educational institutions and associated bodies. Teaching Analytics will then offer intelligence on the learning environment.
Question 1 of this article has been published in:
Kumar, V., Harris, S., Burrows, J., Clemens, C., Boulanger, D., Seanosky, J., Guillot, R., & Guillot, I. (2018). Introspection and Prospects of Learning Analytics. In Educational Communications and Technology (Chinese Journal), 2018(2), pp. 10-15.
How do you see Learning Analytics transforming contemporary practices of education now, in the next 5 years and in the next 10 years?
Question 2 – Individual Answers
Dr. Vivekanandan Kumar: In line with the last point discussed, the objective is to make teachers more accountable and have them take responsibility for the grades their students achieve. It is not about the class average here, but an individual student making a claim that he or she was not taught enough, or not given enough pointers or guidelines, to succeed and to reach his or her optimal level of learning. These issues may be tackled explicitly as analytics becomes more prominent, and that is the transformation that learning analytics could bring about soon, that is, in the next five years. Students, armed with these explicit data as they become available, will not have to wait until the end of the term to turn around their performances. They will be able to do it ‘just in time’, as they study the course. This would be a major shift in educational practices and policies.
This also corresponds to the so-called notion of ‘No student left behind’ and allows one to address the normal curve phenomenon, where, if a teacher takes 100 students, about 30 of them can be expected to perform below average. This normal curve phenomenon has been observed in many learning situations, and people readily accept it without scrutinizing the data at the individual level. The very allusion to the word “normal” is offensive to students when it comes to learning and performance. The very notion of accepting the normal curve as a norm in learning and performance should be shattered.
Educational institutions operate locally. For example, as an educator, I am only interested in the students of my institution and in helping them do better. It is seen as a competition against other universities, where each institution fights the others by narrowly focusing on how to help its own local students. Similarly, students, rather than focusing on their university as a mini world, should acquire a global outlook. They should see learning as a global endeavor, and they should have the support to see themselves as global learners. Learning Analytics can help people do that. Hopefully, Learning Analytics will eventually help students achieve the elusive two-sigma improvement in their learning. This has been a goal for the last three decades, but it has not been achieved in the mainstream.
Jeremie Seanosky: The current practices in education tend to be much more group-oriented and passive, be it in K-12 or even higher education, such as the traditional classroom where the teacher teaches all the students at the same time and where students are there just to absorb information. It is not focused on individual students. For example, in the current setting, some students may not be bold enough to ask the teacher questions, which may disadvantage them more than others who dare to. It can be foreseen that learning analytics will transform education into the provision of more individualized study experiences, where students can learn at their own pace and get the help they specifically need, rather than just being exposed to one-size-fits-all course materials. By involving learning analytics in various aspects of education, students will be offered highly personalized interactions with the teacher, making their learning experiences much more motivating, which in turn will contribute to improving the overall level of education in the population.
Clayton Clemens: This question could be approached by looking at the technologies, and their overall availability, surrounding analytics. It is difficult to provide a precise answer, since so much depends on adoption. The educational sector in general operates in isolation and is slow to adopt learning analytics and the different learning approaches that it might entail. Nevertheless, within the next five years, as educators start to be educated on the benefits and as research develops better and better tools, these could well become part of the mainstream in the classroom. Learning analytics could find a place in the day-to-day operations of an educational institution. If that takes place, then, conservatively speaking, within the next ten years, learning analytics applications could be well integrated within academia, particularly in distance education institutions, and they may even gain traction in traditional universities in specific applications where it makes sense to do so. Furthermore, we are living at a time when experts believe that AI is going to skyrocket in its power and potential in the next ten years, and if that is the case and the technology improves to the point of singularity by 2029, then the educational experience could be completely revolutionized within the next ten years. Automatic personal tutors might be common, guiding learners individually through their studies in individualized, probably non-traditional ways. Learning analytics will be central to making these agents work.
Rebecca Guillot: To answer that question, it is important to have a clear picture of what learning analytics is in order to foresee how it could be useful in schools or institutions. Learning analytics is exactly what a private teacher would ideally do with a student if they were together 24/7, for example. The teacher would analyze the student’s behavior, learning curve, weaknesses, strengths, learning style, etc., daily, and provide adequate interventions to help the student gain confidence and build competences. The teacher would also provide thoughtful feedback and learning activities perfectly adapted to the student’s mental, emotional, physical, and cognitive states. Learning Analytics will also predict what will happen based on the recorded observations of the student and recommend personalized remedial interventions or better-adapted learning materials to change his/her course of actions and make the desirable outcomes occur. Of course, if such a perfect world existed, where every student could be monitored by such a gifted teacher, there would in a sense be no need for learning analytics. However, since this is not humanly feasible, learning analytics is a good substitute, though it will never be the same as human interaction, because humans have a high capacity for detection, such as detecting the mood or emotion of a learner.
On the other side, it is questionable whether learning analytics will really alleviate the teacher’s workload. Learning analytics may rather be a tool to help teachers do their job well by making them aware of correlations in their classroom’s and students’ datasets. Otherwise, these correlations might never have been noticed by the teachers, merely because of the large number of students to monitor and the very limited time that they can devote to them. Thus, by having more data on the students and their learning environments, teachers will be able to virtually spend more time with each of them. For example, a math teacher is notified by Learning Analytics that the level of attention of his/her classroom is low and that, on average, the neurological state of the students in the past hour was indicative of high levels of brain activity, denoting some fatigue among students. This type of insight may help the math teacher adapt his/her pedagogy accordingly, in real time, and optimize the students’ learning experiences.
David Boulanger: The key thing is the individuality of the student. Since students are unique, they require unique learning environments and unique learning experiences. By recognizing, through learning analytics, the uniqueness and individual characteristics of students, it also becomes easier to connect and match similar students or similar learning situations. Instead of having, for example, a classroom of dissimilar students, it will be possible to form, mainly in online settings, classrooms of more similar students. This way, they can have a better experience together, and teachers, by having more homogeneous classrooms, will be able to adapt the course material or structure to these specific students. In addition, Learning Analytics could use how one student succeeded at a specific challenge to apply that solution to a similar student who is currently struggling with the same challenge. Learning Analytics will allow high levels of personalization and adaptivity in the future.
By combining blockchain technology with learning analytics, it will be possible to give students ownership of their competence portfolios while protecting the integrity of the students’ learning data and the derived actionable insights. Thus, the student’s competence portfolio will no longer be managed by the institution itself. By securely storing the students’ learning data in a highly distributed fashion, only the student will be empowered to grant permission to consult subsets of his/her datasets. The application of blockchain technology in education is still in its infancy, but in 5 or 10 years, this is something that could be seen.
Another foreseeable transformation is training students, teachers, and other educational stakeholders as researchers, by designing observational studies adapted to educational constraints and developing extensive causal models that reveal which learning variables are manipulable and how to manipulate them to maximize performance. There are many characteristics that cannot be changed, for example, age or gender, but there are other variables that students and teachers must be aware of, such as self-regulated learning or grit, that can be “tuned” to improve learning performance. Thus, by building proper observational study designs and designing causal learning models, it is hoped that the educational community will be transformed into a researcher community.
Joel Burrows: Looking at why Learning Analytics is going to transform education may be a good approach to show how it is going to change it. Looking at the last few years, it is remarkable to see how machine learning has taken such a huge leap and is about to change everything. Even the media has started picking up stories about how automation is coming and is going to change the workforce, stories such as: in 20 years, truckers probably will not exist anymore. What do such stories mean? They mean that a huge number of truckers in the world will suddenly need to be re-educated. Automation is going to change the world almost career by career. Careers will suddenly disappear, and there will be a whole host of people with a single, now-obsolete job skill set who will need to pick up new skills. Several decades from now, there may be some beautiful future where everything is automated and nobody works, but there will probably first be a period of several decades where people constantly need to go back and be re-educated. This will put huge stress on educational institutions because the current approach just is not going to work. Given that, when you finish high school, you go to university or college and get a massive student debt that follows you for twenty years, how is that going to work when after ten years you must go back to school to learn something new? There will be a lot of pressure from government, because government will have to get involved to make sure that education keeps happening, but there will also be a lot of pressure on educational institutions to be optimized and efficient. This is where learning analytics really has an opportunity to come in and change things radically, because learning analytics can optimize education and provide the opportunity to educate a huge number of people, probably much more cheaply than they are being educated right now.
Post-secondary institutions will be targeted first because this is where the biggest bang for the buck will be found. Higher education is a more competitive environment with many institutions, making it easier to convince these institutions to try learning analytics. Elementary and high schools are controlled by provincial ministries or state departments; there are far more of them, and they do not tend to be as competitive, so they may be hesitant to try new things. They may need to see the success of Learning Analytics in higher education before applying it themselves. There is a huge opportunity for learning analytics here. The field needs to move fast to establish itself, because within ten years these stresses on the educational system will start to appear. A massive number of people will look to go back to school. Learning analytics needs to focus on how these people could be re-educated optimally and efficiently.
Steve Harris: With the collection of analytics across courses, students will build up an education profile that will continue to develop throughout their academic career. These learning profiles can then be used to adjust the learning experience for each individual student, to maximize potential success, by promoting the specific materials and course content that were most successful for past students with similar learning types. Thus, education becomes a much more customized learning path for students as they continue with their studies. In fact, this should not be a huge jump. A lot of this happens in marketing right now. For example, Facebook collects all sorts of demographic data: what you like, who you talk to, what kinds of posts you engage with, and whether or not you visited pages linked to Facebook. It extracts those interests and calculates the next best thing to show you and how you will react to one type of ad versus another. A lot of Learning Analytics technologies already exist, so it should not be a huge jump to apply them. Educational institutions are slow to adopt LA, while the reality is that other sectors are already engaged in analytics. Certainly, there is a need for robust and intensive systems, but as the management of the data, algorithms, and processes directly related to education and learning analytics improves, development along those lines will occur as well. The Learning Analytics market will grow as more people enter it for retraining, such as people for whom traditional education just did not work and who ended up in positions below their academic potential. These Learning Analytics systems should be developed going forward. Given that marketing is already doing a lot of this, it should not be a huge jump to do it with Learning Analytics in 5 to 10 years.
Question 2 – Group Discussion
Dr. Vivekanandan Kumar: On how Learning Analytics will affect jobs, some believe that humans are irreplaceable, while others believe Artificial Intelligence will replace humans in most contemporary ‘jobs’. I believe that Artificial Intelligence can replace and mimic people. Such tendencies have been proven in the world of games, and it will not be long before they become reality in other domains. Artificial Intelligence can supplement the work of physicians. Patients, for example, can talk to a virtual doctor for the initial contact and then go to the expert only for that expertise. In a way, healthcare expertise can be split into a machine-implemented consult and expertise as yet incomprehensible to machines. People can resort to the low-hanging, non-human consult solutions first. This is happening already in the medical field. In business environments, business intelligence appears mature enough to replace human managers. What kind of a stretch would it be for Learning Analytics to achieve enough maturity to replace part of a teacher’s expertise? It is already happening. Students consult these tools first before talking to their tutors.
Clayton Clemens: There is a similar trend happening in business with personal productivity tools. A lot of companies who create and maintain personal project management and productivity tools have goals to eventually make their software smart enough to take basically all the coordination work out of projects. All the logistical stuff, like setting meetings and even deciding on tasks for people for a day, will be assisted by AI so that workers can focus strictly on their work. It is easy to make that shift in learning analytics, even without super-intelligent tutoring systems. It could just be systems that are smart enough to handle logistics. That would be a huge gain in efficiency, both in terms of administrative support from an institutional perspective and in maximizing the time the student spends doing the learning and the academic work.
Clayton Clemens: Is there enough openness amongst educators in general about adopting this technology? What is in play is the automation of the teacher’s job to some extent, so will educators be okay with that?
Joel Burrows: Educators are probably not going to have a choice. The point is that entire careers will suddenly disappear, and a huge number of people will go back to school. These schools will just be overwhelmed by the number of students that will be coming in, and they will have to rethink how they do re-education. The traditional method with teachers will not work because there will be so many students.
David Boulanger: However, ethical issues will pop up and may slow down the arrival of those stresses.
Steve Harris: A lot of this depends on cost pressure as well. There is pressure on budgets. Academic institutions are not very different from businesses in that sense. As academic institutions want to increase the number of courses they offer, they will be hesitant to increase the number of full-time instructors. As automation evolves, if they can introduce some level of automation to help them spread out their hiring, that will be a draw for administration as well.
Joel Burrows: At the same time, a lot of people will still want to have and need that human touch in education. In healthcare, for example, it is foreseeable that doctors will be replaced by AI before nurses because someone who is sick in the medical system may still want that human touch and a nurse can provide that better than a doctor.
Steve Harris: On the other side, it would be possible to end up with a two-level help desk system, where the first level is automated and tries to address Level 1 issues as much as possible, and remaining issues get escalated to an instructor. In that case, an instructor or a tutor can handle a larger number of students because there is an automated system that flags the most important issues, those that need that person-to-person connection. Those students who do not need or are not as interested in that connection can be left to go through the course material on their own.
Joel Burrows: It is like optimizing the learning environment. Some people need that human touch and thrive on it and other people can probably get along fine without it, but part of learning analytics is learning those models and applying them to students to optimize their environment.
Dr. Vivekanandan Kumar: As an educator, I would not mind being replaced before my teaching assistants get replaced, on the assumption that students feel closer to teaching assistants than to teachers. On another note, there are rudimentary tools already in place that measure how well an educator performs. For example, administrators can now see the number of times teachers have interacted with their students and use that as a measure of student or course satisfaction. Such measures are not quite reliable, mostly because of the way they connect numbers with performance. If someone conducts a study and shows that those numbers make sense, then they will be accepted, but right now people are using these technologies in an ad hoc fashion, which should not be accepted.
Steve Harris: Some colleges calculate how long it takes a teacher to respond to students. Teachers get little reports on those metrics. These metrics do not actually trigger any action; supposedly, they are just there to be helpful.
Rebecca Guillot: To wrap up on whether technology could replace humans: it is true that technology is becoming more and more efficient and that many tasks can now be done by technology. The only point is that students need to feel that their teacher understands them and guides them through what they must learn. This is the main responsibility of teachers: to put themselves in the place of students, take students where they are, and make them progress. So, if the technology is advanced to the point that students can have this feeling, then learning analytics would be perfectly fine to replace the human.
David Boulanger: It depends also on the school level that is targeted. Higher education is one thing, and elementary school is quite another. For kids, replacing the human is just not the correct solution. However, as the child grows up, the student will become more accustomed to the technology, and the technology will play a greater role over time.
Dr. Vivekanandan Kumar: If someone says that Artificial Intelligence is directly impacting the development of the human brain, and if that is true, then one can expect Artificial Intelligence and human brains to co-evolve, be co-dependent, and be companions! Such co-operative brains may achieve greater heights of processing than either Artificial Intelligence or the raw human brain alone. Who knows?
What are the challenges (technical, theoretical, and practical) faced by the field of Learning Analytics as it approaches higher education reforms?
Question 3 – Individual Answers
Steve Harris: One of the biggest challenges, which has already been alluded to, is privacy. There is a lot of very intimate information that can be collected as part of learning analytics, a lot of data that can be very sensitive. Part of the reason that similar analytics are implemented in business, marketing, banking, or finance more quickly than in education is simply that academic institutions traditionally have very strict ethical standards and are very careful with private information. Hence, it will be a real challenge for any institution to engage in learning analytics. Anyone who has gone through the ethics approval process to collect relatively straightforward pieces of information can attest that it is certainly not going to get easier as more sophisticated data are collected.
Another challenge is Internet access: Learning Analytics is mainly accessed through the Internet and high-speed systems, which may constitute a barrier for distributed and online education. Data security and platform capacity are going to be issues as well.
One of the biggest challenges is determining who owns the intellectual property that makes all these things run. Educational institutions are like businesses: they try to attract people, while there are companies who would pay quite a lot of money to have this level of data on people, which touches on privacy as well. However, even just licensing the intellectual property to make all this run could put academic institutions in the stranglehold of the software platforms and technologies they need to maintain their systems. It is a little like genetically modified seeds: once you get your seeds, you plant them, and if they are not fertile, you must keep going back and buying seeds from the same vendor, no matter how much the price is raised. There may be a bit of a challenge there. Finally, there is the question of who will pay for all these technologies, because they are very expensive and institutions do not have a ton of money kicking around. For instance, a certain university had a huge commotion because it had rooms in the geography department that were sponsored by major forestry companies. Many students were terribly upset because some of the university’s operations were being controlled by these corporations. That is a real challenge, because at that point, is there an influence on the course material or not when this level of financial support is needed for those resources?
Clayton Clemens: Educational institutions are unfortunately not multi-billion-dollar enterprises, and as such they will probably lack some of the big technology infrastructure, both hardware and software, needed to meet some of learning analytics’ goals and requirements. Educational institutions will not be able to run gigantic server farms to hold and process all these data in real time the way some of the major big data companies do. Similarly, for software, educational institutions’ learning analytics researchers and developers are mostly constrained to open-source technologies, which are fine and functional, but do not have the shine, polish, marketability, and sophistication that a lot of enterprise-level software might have. Moreover, in cases where proprietary technologies are required, higher education institutions will not necessarily be able to afford these solutions.
A second challenge is adoption. In the domain of business analytics, for example, there are a lot of business intelligence tools built out there, but there are tremendous challenges just to get people to adopt them. A cultural shift is needed for good adoption and implementation of learning analytics. There must be an environment that supports the management of processes and the expectations of data-driven decision making, and for that, adoption by prominent stakeholders is needed, as well as confidence in the Learning Analytics insights presented. Learning analytics tools must represent a source of truth.
Joel Burrows: There are three main challenges. The first one is privacy. Learning Analytics depends on data, but privacy may hinder the availability of data. Privacy can be weaponized by those fearful of LA. Facebook is dramatically changing people’s attitudes towards privacy, and people may be extremely hesitant to give up any private information. This may be an even larger problem in elementary and high school, where parents can be extremely protective.
The second challenge is acceptance. Not everyone is willing to accept change. Many people still refuse to believe in the power of analytics. People still refuse to accept that machine learning is overtaking the world, despite the fact that it powers many things they use (e.g., Siri). Educators can be very set in their ways and may resist Learning Analytics applications. Many educators can be quite tech-phobic. Those of us who work in technology sometimes forget that a lot of people struggle with the basics of computers and technology.
The third big challenge is that Learning Analytics applications require people with machine learning and AI skills. These skills are in incredibly high demand. Educational institutions do not have the financial resources of the big tech companies and may struggle to find and retain people to work on Learning Analytics applications. They may have to rely on students to implement applications, or somehow convince companies such as Google, Apple, Amazon, or Facebook to get involved.
David Boulanger: From a technical perspective, the fact that most machine learning applications must operate in homogeneous environments, combined with the reality that Learning Analytics is highly domain-dependent, constitutes a big challenge. Machine learning requires a sufficiently large number of data samples and, in most cases, requires that the same data types be collected over time, necessitating starting over if new data types become available, which creates siloed learning environments. To generate the data that a machine learning solution needs, everything must be controlled. This results in a kind of paradox. Our team recently wrote a paper about automated essay scoring (AES) systems, which concluded that it may take between 1,000 and 2,000 hand-graded essays for a teacher to implement a single writing construct or writing exercise within a learning analytics system. That number is far too large for a single teacher who wants the opportunity to customize his/her teaching and create adapted writing activities for his/her students. Thus, allowing for individualization in the context of AES currently imposes too heavy a workload on the teacher, and requires reducing the number of hand-graded essays needed to automate scoring. In the current state of things, automated essay scoring systems must be implemented over part of or the whole curriculum, requiring collaboration between several teachers, because a single teacher cannot provide enough hand-graded compositions per writing construct, consequently limiting the capacity to personalize the teaching and learning processes. The paradox is that personalization and adaptivity must be fed with too much human expertise because machine learning operates in homogeneous and controlled environments.
Deep learning explores ways to address this issue through transfer learning and recurrent neural networks (RNNs), but it is in its infancy and limited in its scope of application, such as image classification. Similarly, learning analytics is about finding cause-effect relationships among learning variables. Given the uniqueness of every learner, observational studies require a huge number of participants. Observing students in the wild requires a larger number of participants, while to deal with more reasonable sample sizes, the environment in which students are observed must be homogenized, limiting the extent to which personalization and adaptivity can occur.
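The hand-grading bottleneck described above can be illustrated with a deliberately tiny sketch. All essays, features, and scores below are invented for illustration; a real AES system would use far richer features and models. The point is the regime: a model fitted on a crude surface feature from only a handful of graded essays cannot be trusted to generalize, which is precisely why estimates on the order of 1,000 to 2,000 hand-graded essays per construct arise.

```python
# Illustrative sketch (hypothetical data): scoring essays from one crude
# surface feature via closed-form least squares, with far too few
# hand-graded samples to generalize.

def surface_feature(essay: str) -> float:
    """A single crude feature: average word length."""
    words = essay.split()
    return sum(len(w) for w in words) / len(words)

def fit_linear(xs, ys):
    """Closed-form least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Toy "hand-graded" corpus: (essay text, teacher's score out of 10)
graded = [
    ("short plain words here", 4.0),
    ("reasonably articulate sentences throughout", 7.0),
    ("sophisticated vocabulary employed consistently everywhere", 9.0),
]
xs = [surface_feature(e) for e, _ in graded]
ys = [s for _, s in graded]
a, b = fit_linear(xs, ys)

# With 3 samples and 1 feature, this "score" is little more than noise.
predicted = a * surface_feature("moderately developed argumentation") + b
print(round(predicted, 1))
```

The sketch scores a new essay plausibly on this toy data, but the fitted slope would swing wildly if any one graded essay were swapped out, which is the instability that large hand-graded corpora exist to suppress.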
Dr. Vivekanandan Kumar: The first challenge is that, if developing a decent Learning Analytics system that accurately measures students’ competences is feasible, there is a potential for the exposure of students’ lack of competences. That is, students would not be able to hide what they lack if their study portfolios are required by an employer. That is going to be a challenge. Will students want that, since not everyone is going to be in the top ten percent? What about the other 90 percent? Will they accept showcasing exactly what they know and what they don’t? Learning Analytics enables access to specialised tools that students can use outside their classroom to become self-reliant in their studies. Again, such interactions can be tracked, and students may be asked to reveal their study portfolios. This will be a sizeable challenge for any learning analytics proponent.
The second one is cognification. This is just like the electrification that happened about one hundred years ago, when everything manual was being electrified, leading to the need for electric grids. People worldwide adopted the grid model and started to rely on tools that would take care of otherwise manual processes. Analytics involves cognification, where intelligence and insights are generated when day-to-day data observed from life experiences are shared with the ‘cognification grid’. The cognification grid will have ubiquitous access pipes to individuals. Analytics solutions will be available through the cognification pipes for learning-related problems, which will tentatively be solved on demand. This is a good thing for the future, but cost-wise, logistics-wise, acceptance-wise, and belief-wise, how is it going to happen? That is a rather huge challenge.
Third, Learning Analytics is expected to increase the motivation to learn, and consequently more people will succeed in learning, enabling a sizeable part of society to become a pro-learning society. Sustaining such motivation is an even bigger social and practical challenge.
Fourth, the Artificial Intelligence techniques used in Learning Analytics can become too large to be handled by humans! Artificial Intelligence will start to collect and assemble personal data as it becomes more autonomous. What about ethics for systems that embed Artificial Intelligence? There is a compelling need for human-oriented control of Data Analytics systems.
Finally, Learning Analytics will instantiate a cloud of theories with datasets arriving from various cognification pipes. Managing and validating such a cloud is a challenge.
Rebecca Guillot: There are various levels of challenge. The first one is collecting data through sensors. Of course, for learning analytics researchers, the more data they have, the happier they are; it is like a gold mine. The main challenge is to collect data through sensors without harming the natural development of students. Sensors can be a nuisance for some daily tasks. Therefore, a good balance must be achieved between capturing enough data for the analyses to be relevant and not capturing so much that it interferes with the learning process and harms the students.
Another challenge is to synchronize data collection from the various sensors so that the Learning Analytics ecosystem can have a very precise view of the student at a specific point in time. For example, Learning Analytics can track physiological data such as heartbeat, brain activity, eye gaze, emotion, stress, and fatigue as students work through writing activities. The writing process will therefore be accompanied by contextual data, which must be mapped to every step in the writing process.
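The alignment step can be sketched minimally as follows. The sensor values and event times below are invented for illustration: each keystroke event from the writing process is annotated with the nearest heart-rate reading, using a binary search over the sensor stream's timestamps.

```python
# Illustrative sketch (hypothetical data): aligning asynchronous sensor
# streams by timestamp so each writing event carries the nearest
# physiological reading.
import bisect

def nearest_reading(timestamps, values, t):
    """Return the sensor value whose timestamp is closest to t."""
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

# Heart-rate stream sampled once per second: (seconds, bpm)
hr_times = [0.0, 1.0, 2.0, 3.0, 4.0]
hr_bpm   = [72,  74,  75,  73,  71]

# Keystroke events arrive at irregular times
keystrokes = [(0.4, "T"), (1.9, "h"), (3.6, "e")]

annotated = [(t, key, nearest_reading(hr_times, hr_bpm, t))
             for t, key in keystrokes]
print(annotated)  # → [(0.4, 'T', 72), (1.9, 'h', 75), (3.6, 'e', 71)]
```

A real ecosystem would also have to handle clock drift between devices and different sampling rates, but the core mapping problem is the one shown here.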
Interpreting those data correctly constitutes another challenge. Humans perceive things so easily, while a computer must be trained for months before getting them right. Emotions, for example, can be captured to a certain level, but detection will never reach the level of a human. Teachers can also quickly detect the symptoms of brain fatigue, while computers will probably not detect them in time. Those small gestures of body language, which humans detect even unconsciously, must be taught to computers through voluminous and costly datasets and complex machine learning techniques.
Jeremie Seanosky: First, learning analytics needs to be embedded and integrated into existing curricula and learning environments, because right now there are a lot of disparate, non-connected systems that are customized for specific topics or institutions. There is no large high-tech corporation like Google in learning analytics that tracks the whole profile of learners throughout their learning journey.
This leads to the second challenge, which consists in building a centralized learning analytics ecosystem that follows students throughout their learning experiences, from K-12 up to higher education, in all subject matters such as math, science, English, and history, and that will discover and assess their skills and affinities and help them improve their learning experience.
Lastly, there is the challenge of establishing the proper ways to analyze students’ learning traces and provide proper feedback in a timely manner that is consistent with their overall learning profile. With all the data that can be captured in learning analytics, it is very easy to overload students with all kinds of data and interpretations that are not necessarily meaningful or helpful to the immediate improvement of the student’s skills.
Question 3 – Group Discussion
David Boulanger: As a complement to the discussion on the challenge of scaling learning analytics or machine learning solutions, scalability can be achieved along several dimensions such as the learning domains (e.g., coding, physics, math, English writing, music), courses or level of expertise, teachers, institutions, and so on. Scaling learning analytics will require standardization because all the stakeholders involved in one of the dimensions will have to agree upon many things. Hence, standardization will also be a key challenge in the future.
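One existing standard in this space is the Experience API (xAPI), which encodes learning events as actor-verb-object statements. A minimal, illustrative statement might look like the following; the learner, course URI, and score are placeholders, not values from any real system.

```python
# A minimal learning-record statement in the style of the xAPI
# (Experience API) standard: one example of the kind of shared format
# that cross-institution learning analytics would rely on.
# The actor, object id, and score below are illustrative placeholders.
import json

statement = {
    "actor": {
        "mbox": "mailto:student@example.org",
        "name": "Example Student",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.org/courses/math101/quiz-3",
    },
    "result": {"score": {"scaled": 0.85}},
}
print(json.dumps(statement, indent=2))
```

Because every system emitting such statements agrees on the same vocabulary of verbs and the same record shape, data from different courses, tools, and institutions can be pooled, which is exactly the agreement problem standardization has to solve.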
Steve Harris: This standardization will probably be owned as an intellectual property of some sort because it will be expensive to develop and bring in organizations. So, that adds to the challenge as well. It would be ideal if it could be an open-source solution, but history suggests that it may not be.
Dr. Vivekanandan Kumar: A key challenge is the growth of Artificial Intelligence itself. Artificial intelligence is the engine behind any analytics. Recent media postings have portrayed Artificial Intelligence as an evil thing that takes over human beings and grows beyond human capacity and comprehension. People do not like to see Artificial Intelligence grow that way at all. How long will it take for these people to change their attitude and finally accept Artificial Intelligence’s influence on how human beings learn? Those kinds of societal-level questions are going to be difficult to negotiate as Artificial Intelligence becomes more and more sophisticated.
Steve Harris: Most of the time, people’s fear (and in some cases even their enthusiasm) stems from a lack of knowledge about how AI works and what it does, making it difficult for people to be willing to undertake analytics projects from the beginning. Recently, I had a discussion with a manager involved in the decision-making process of a company. That person clearly had no idea how artificial intelligence systems work. The manager tried to tell anybody who would listen that all that was needed was to dump all the company’s data into IBM’s Watson and it would tell them what they needed to do. I said, “What do you mean by ‘what we need to do’?” The manager said, “Well, it will give us the instructions, it will cluster everything, it will tell us who we should be marketing to.” The manager had no idea what the outcome would be, just that Watson was going to tell them what to do. It was kind of a funny conversation. No matter what I said, the manager could not be steered off this concept, saying, “No, no, IBM has figured it out. You just put your data in and it will tell you what to do.” The manager was adamant because he/she had read a Forbes article on it and interpreted it a certain way. That alone becomes a bit of a challenge, that is, having a more shared understanding of how all this works.
Joel Burrows: There is the opposite extreme, where people just do not believe it could work at all, as with self-driving cars. Despite all the media attention, there will be a lot of people who will not believe self-driving cars are going to work until one drives right by them, and even then, they will probably be looking for a hidden person steering the vehicle. Some people are just not willing to believe it. Another example is the application of machine learning in music education. Recently, I was telling a piano teacher that it is now possible for computers to assess the progress that a student makes, that is, to tell how much a student is improving on a piece of music. I kept explaining this, and the piano teacher refused to believe that it could exist. I said, “No, no, it’s there, I actually wrote…,” but he/she refused to believe that it was possible.
Dr. Vivekanandan Kumar: Not long ago, I had a conversation with a researcher on whether Artificial Intelligence could replace government and take over from politicians. The point was that there are researchers who already know Artificial Intelligence and what it can do, and they truly want to see that transition happen soon, saying that some ‘low-hanging’ aspects of policy making can be transferred to Artificially Intelligent systems.
Clayton Clemens: People tend to go to both extremes in a lot of cases: either AI is the champion, or it is Skynet. AI will probably do a better job in politics than a lot of people who are in politics. Again, it is more about the subtleties. When you must answer questions in life, nothing is at one extreme or the other like yes/no answers. There are usually nuances.
What are some of the key research questions you would like learning analytics to solve in the near future?
Question 4 – Individual Answers
Steve Harris: Probably one of the biggest things learning analytics should start to address is learning disabilities, not only in post-secondary institutions, but also in elementary and high schools. I do a fair amount of volunteer work with kids, and it is a fairly regular occurrence that parents come up to me and almost apologize in advance for the way their kids are going to act, saying they have mental problems or just do not do well in school, claiming it is because their kids have not been challenged or are finding school very challenging and difficult. Not always, but with a number of these kids, when you start working with them, engaging with them, and tailoring a learning situation so that it speaks to the way they learn, the change is dramatic and very quick. I have had parents be amazed that their kids are engaging, learning, and doing well. At the college level too, it is very difficult to meet the requirements of students with special needs with the current environments and tools. As discussed previously, having a learning analytics profile would make it possible to start looking at how these special programs could be developed and customized even further to learners, and to gain more knowledge about how such programs and content could be successfully created for learners with learning disabilities. This is an area with a lot of opportunity in the short term.
David Boulanger: It would rather be a block of research questions. The objective would be to conduct a big longitudinal observational study in which causal directed acyclic graphs would be developed to capture the immutable (e.g., gender) and manipulable (e.g., self-regulated learning) variables in the learning and teaching ecosystem, and in which the impact of all these variables on academic performance (measured by various summative and formative metrics, including competence-based assessment) would be estimated.
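The study design described above could be sketched as a small causal directed acyclic graph. The variable names and edges below are purely hypothetical placeholders, not findings from any study; the sketch uses Python's standard-library `graphlib` so that building a topological order doubles as a check that the graph really is acyclic:

```python
# Minimal sketch of a causal DAG for the kind of study described above.
# All variable names and edges are hypothetical illustrations.
from graphlib import TopologicalSorter

# parent -> children edges: immutable traits and manipulable
# behaviours both feed (directly or indirectly) into performance.
edges = {
    "gender": ["self_regulated_learning"],           # immutable -> manipulable
    "prior_knowledge": ["academic_performance"],
    "self_regulated_learning": ["study_time", "academic_performance"],
    "study_time": ["academic_performance"],
    "academic_performance": [],
}

def parents(node, graph):
    """Direct causes of `node` in the DAG."""
    return sorted(p for p, children in graph.items() if node in children)

# TopologicalSorter raises CycleError if the graph has a cycle,
# so computing a static order validates the DAG structure.
predecessors = {child: parents(child, edges) for child in edges}
order = list(TopologicalSorter(predecessors).static_order())

print(parents("academic_performance", edges))
print(order)
```

In a real study, each edge would carry an estimated effect size, and the graph would guide which confounders to adjust for when estimating the impact of a manipulable variable on performance.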
A second research question would be estimating the short-term and long-term impact of learning strategies as the writing or coding processes of students are tracked and analyzed. For instance, if a student writing an essay uses a novel strategy compared to other students, does that strategy benefit the student only for the learning task at hand, or does it also serve his or her long-term interests?
Joel Burrows: Learning Analytics needs big and flashy successes to announce itself to the world. There are several opportunities for these. For example, learning analytics should be able to make traditional information-dump courses obsolete and completely change how they work. Given a document of text (and perhaps pictures), LA-based applications could automatically create an education program that transfers the information to the student, try a variety of means of information transfer (e.g., text or audio), and figure out how the student learns, ideally by constantly quizzing the student on the information already covered. Such tools could optimize the knowledge transfer and give students confidence that they have mastered the information. Once Learning Analytics learns more about the student, it could even start predicting how much more time the student needs to master the material (something very useful to a student getting ready for an exam). All of this is doable and would be far superior to the current approach of having students read a textbook, try to memorize it, go to a classroom, watch the instructor read the prepared slides, and then take an exam to test how much they have memorized.
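The quiz-driven loop described above could be sketched roughly as follows. The mastery model (a simple exponential moving average over quiz outcomes), the topic names, the threshold, and the learning rate are all illustrative assumptions, not a validated learner model:

```python
# Minimal sketch of a quiz-driven mastery loop: always quiz the
# weakest topic and update a per-topic mastery estimate.
# Constants and topic names are illustrative assumptions only.

MASTERY_THRESHOLD = 0.9
LEARNING_RATE = 0.3  # weight given to the most recent quiz result

def update_mastery(mastery, correct):
    """Exponential moving average of quiz outcomes (1 = correct)."""
    return (1 - LEARNING_RATE) * mastery + LEARNING_RATE * (1.0 if correct else 0.0)

def next_topic(masteries):
    """Always quiz the topic the learner currently knows least."""
    return min(masteries, key=masteries.get)

def simulate(responses, topics=("loops", "functions")):
    """Run a fixed sequence of quiz outcomes through the loop."""
    masteries = {t: 0.5 for t in topics}  # uninformed prior
    for correct in responses:
        topic = next_topic(masteries)
        masteries[topic] = update_mastery(masteries[topic], correct)
    return masteries

print(simulate([True, True, False, True]))
```

A real system would replace the moving average with a proper learner model (e.g., Bayesian knowledge tracing) and use the fitted learning rate to extrapolate how many more quizzes, and hence how much more time, the student needs to reach the mastery threshold.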
The other thing is that it would be nice to see the grand vision of what learning analytics is doing. Lots of papers attempt to define Learning Analytics, and there is obviously a lot of effort being put into distinguishing Learning Analytics from associated terms (see Question 1), but very few papers describe the kind of future that Learning Analytics could provide, such as the one depicted in Neal Stephenson’s novel “The Diamond Age” (1998). People’s imaginations need to be captured by what learning analytics can be, but that is not yet being done. Learning Analytics is a Copernican revolution in education. Students are still expected to conform to the course: they schedule their lives around classes, assignments, and exams. Learning Analytics instead allows the course to conform to the student. If a learner has a way of learning that works best for them, the idea of learning analytics is to optimize the course so that it delivers the material in the way that best suits the student. That would completely change education. More of these big-vision papers explaining what learning analytics could bring to education are needed.
Finally, it will be critical to investigate how to get educators to accept learning analytics. Without their acceptance, progress will not be made.
Dr. Vivekanandan Kumar: My research question is about the notion of theorization, the ultimate ability of the human brain. Machines can create data, as people do. People convert data to information, and machines can do the same. People convert information to knowledge, and machines can do the same. A piece of knowledge can be converted to intelligence by humans, and machines are approaching that frontier too. Finally, from intelligence, people abstract and theorize, generalizing viewpoints that percolate and apply to multiple situations and multiple solutions. Machines have not done that yet. Learning analytics could advance to such an extent that it could theorize about how individual humans learn, how groups of learners learn, and how society learns.
Jeremie Seanosky: My main struggle with learning analytics is the idea of competence assessment. This seems to be a key focus of learning analytics, with the idea of replacing teachers, estimating the skill levels of students, and assigning grades. In multiple-choice quizzes, that is not really a problem, but in open-ended learning activities (e.g., essay writing, coding, or math), there are so many paths to a correct solution that a Learning Analytics system will not be able to recognize all of them. Thus, dogmatic feedback in terms of grades or competence assessments would only be counterproductive if the tools are not 100% reliable, and especially if the educational stakeholders do not have full confidence in them. So, Learning Analytics should mainly focus on providing students and teachers with factual feedback on the students’ learning processes. For instance, based on my personal experience, I would not trust a system that tells me that I am 76% good at coding. I would rather have a system that tells me the problems it identified in my code and what I should have written instead, and that lets me decide for myself whether the feedback is relevant. I might have had a reason for coding it this way instead of that way.
Rebecca Guillot: The goal of learning analytics is to help the learner learn more efficiently. Although studies have already been done on assessing the impact of learning analytics on academic performance, it should be investigated further at various levels of education, in several subject matters, with both formative and summative metrics, and so on. Other research questions would include: 1) the impact that insufficient or missing data have on the quality of the insights generated by Learning Analytics systems; 2) how much learning analytics is able to assist junior teachers, or even compensate for incompetent ones; 3) the level of trust that teachers and students have in the insights generated by Learning Analytics systems; 4) the factors that could neutralize the positive effects of LA; and 5) whether Learning Analytics can undermine the trust that students have in their teachers.
Clayton Clemens: From a business analytics perspective, the application of learning analytics along the continuum of the Gartner model of analytics maturity in organizations is of interest. For learning analytics, this implies moving from descriptive analytics (what happened), through diagnostic analytics (why it happened) and predictive analytics (what could happen under various circumstances, given past trends), to prescriptive analytics, the ultimate stage, where the system can observe a certain kind of learner and prescribe what will work best for that learner.
Another one is estimating the effect of personal productivity tools on learners’ performance. There is a plethora of digital applications that people can now use to track their tasks and projects and break down work in terms of how they want to accomplish it. What is their impact in academic situations? If students had access to some of those tools while doing their degree and had a better idea of how to use them, would they complete things more efficiently?
The connection between learning analytics and teaching analytics, the other side of the coin, should be explored further, because Learning Analytics generally looks at learning in isolation, almost regardless of the specific course structure and teaching style. The problems get more complex when we look at both as they interact. By having the two sides of the coin see each other as a collective moving target, it would be fascinating to observe how learners’ experiences change when variables from both teaching and learning analytics are included. Finally, evaluating whether learner clustering and communities of practice, where information about how different types of learners learn is available, result in a more efficient system of education overall would also be worth investigating in depth.