Use Case: Adaptive learning
What’s the issue?
With the increasing use of e-learning and blended learning, there is a growing tension between the potentially interactive physical classroom experience and the largely static, homogeneous content provided through systems such as virtual learning environments (VLEs). In addition, growing student numbers and widening diversity pose challenges for teaching staff attempting to provide a differentiated, contextualised and personalised learning experience to a large audience.
How can the intelligent campus help?
A combination of mobile devices, classroom technology, learner data and institutional systems can be integrated to understand learners’ needs and guide them towards effective outcomes. A wealth of data about learners already exists: not just their activity within the VLE, but their location on campus (or off it), their interactions with the library, demographic data from student information systems, their historical patterns of learning and even their ambitions or goals for the future.
Individuals’ mobile devices, along with classroom technology, could interact with this wealth of data about the student and their learning to provide an enhanced experience. Personal devices and apps may also hold information on individual preferences and learning styles, enabling differentiated interactions.
What could be done?
- Digital textbooks providing hints, explanations and practice questions, linked to the class content on the VLE. Monitoring common problems could feed into course design.
- Headsets supporting virtual or augmented reality views of content and places around the lab, campus or at sites of educational interest.
- Adaptive devices providing personalised interfaces (including accessible tools) to meet learning preferences, and by monitoring progress adding new or supplementary content to match the needs for individual pace and style.
- Virtual field trips (or augmented reality physical field trips) including location-awareness to give students prompts and contextual content, allowing personalised interaction with the environment.
- Science experiments conducted remotely through the internet, with devices connected to laboratory equipment to take measurements and set analysis tasks.
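The adaptive-pacing idea in the list above can be illustrated with a minimal sketch: choose a content tier from a learner’s recent quiz scores. The function name, tier labels and thresholds are invented for illustration; a real system would draw on far richer data.

```python
# Hypothetical sketch of adaptive pacing: pick the next content tier
# from the mean of a learner's recent quiz scores (each in 0-1).
# Thresholds and tier names are illustrative, not from any real system.

def next_content(recent_scores, low=0.5, high=0.8):
    """Return a content tier based on recent performance."""
    if not recent_scores:
        return "core"  # no data yet: start with the standard material
    mean = sum(recent_scores) / len(recent_scores)
    if mean < low:
        return "supplementary"  # extra practice and explanations
    if mean > high:
        return "stretch"        # more challenging material
    return "core"
```

In practice the thresholds themselves could be tuned per module, or replaced by a learned model, but the rule-based form keeps the behaviour easy for teaching staff to inspect.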
Other applications use real-time natural language processing to detect intentions or emotions in conversations with peers or tutors, for example via chat-based apps on smartphones. Intelligent “conversational” agents can provide tailored feedback to students on their performance, or respond to common questions with personalised information informed by the student’s context, including their location on campus and proximity to learning resources and facilities.
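A very reduced sketch of such an agent is keyword-based intent matching, with the reply tailored by the student’s location. The intents, keywords and locations below are invented; a production agent would use a proper NLP toolkit rather than word overlap.

```python
# Illustrative rule-based "conversational" agent: match a question to
# an intent by keyword overlap, then tailor the reply with the
# student's campus location. All intents and answers are invented.

INTENTS = {
    "library_hours": {"library", "opening", "hours"},
    "deadline": {"deadline", "due", "submit"},
}

def answer(question, location=None):
    words = set(question.lower().split())
    best, overlap = None, 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > overlap:
            best, overlap = intent, score
    if best == "library_hours":
        reply = "The library is open 08:00-22:00."
        if location == "library":
            # Location-aware personalisation of the same answer.
            reply += " You're already there - ask at the front desk."
        return reply
    if best == "deadline":
        return "Your next assignment is due Friday."
    return "Sorry, I didn't understand that."
```

The interesting part for the intelligent campus is the `location` parameter: the same question gets a different answer depending on where the student is standing.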
Are there any current examples?
There are few examples in practice of IoT devices being integrated into adaptive learning approaches. In most cases where adaptive learning is used, it relies solely on institutional systems and learning analytics; these approaches could be extended to integrate with mobile devices.
Bolton College provides a personalised pathway through VLE content by differentiating students according to data about them, such as how they performed in previous tutorials, offering more stretch and challenge if they do well. The college has also implemented an online digital assistant that can respond to natural language questions, in some cases removing the need for the student to access other systems such as the VLE.
The University of Edinburgh has used a tool based on machine intelligence, with quizzes designed to link to learning pathways. It has also piloted a Twitter bot to answer simple questions, for example about deadlines, and has applied cluster analysis and segmentation to a wide range of historical data to identify learning trajectories and map them to current students.
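The clustering step can be sketched with a tiny k-means over simple engagement features (say, VLE logins per week and an average score, already scaled to comparable ranges). The features, data and naive first-k initialisation are for illustration only; a real analysis would use a library such as scikit-learn and far richer, properly normalised data.

```python
# Minimal k-means sketch for segmenting students by engagement
# features. Everything here (features, data, initialisation) is
# illustrative; this is not how Edinburgh's analysis was built.
import math

def kmeans(points, k, iters=10):
    # Naive initialisation: take the first k points as centres.
    centres = list(points[:k])
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # Assign each student to the nearest centre.
            i = min(range(k), key=lambda c: math.dist(p, centres[c]))
            groups[i].append(p)
        # Move each centre to the mean of its group (keep it if empty).
        centres = [
            tuple(sum(xs) / len(g) for xs in zip(*g)) if g else centres[i]
            for i, g in enumerate(groups)
        ]
    return centres, groups
```

Mapping a current student onto the nearest historical cluster centre is then a one-line distance comparison, which is what lets past trajectories inform present interventions.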
What about ethical and other issues?
Security of data is crucial: these applications involve wide-ranging data, including students’ personal circumstances and, potentially, financial data. For example, a student may use a chatbot to query how much they still owe on their course fees.
Algorithms may be making decisions and providing feedback on behalf of tutors, including on sensitive topics. Even if teaching staff have been involved in programming the behaviours and responses, sometimes an automated system may not produce an output that a human would agree was “appropriate”. It is likely that careful moderation would be needed, at least until the technology was sufficiently mature.
Who needs to be involved?
Involving teachers in the development alongside data and systems specialists allows for a more integrated approach, applying the technology within its pedagogic context. Additional skill sets, such as instructional design and algorithm development, may be needed. A robust policy at an institutional level will help manage the ethical issues more effectively.
Adam Field (who now works for Jisc) and some colleagues on my team worked on a web-controlled experiment: a servo-based rig on which people could schedule experiments, which would then send them the data and a video. We never expanded on that pilot, but it would be interesting to see whether you could build an API for plugging in experiments, with a fairly standard web front end. Speaking to teachers and technicians, there was concern that students need real hands-on experience, and also that some experiments require humans to reset them and to ensure that safe values are entered.
That said, I think it’s an interesting idea for remote learners, and it works for people learning in different time zones. Maybe it could work for MOOCs?
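The scheduling and safety concerns above suggest what the server side of such an experiment API might need before anything reaches the hardware: one booking per time slot, and parameter values checked against safe ranges. This is a hypothetical sketch; the class, slot format and safe ranges are all invented.

```python
# Hypothetical sketch of booking and safety checks for a remotely
# controlled experiment rig. Names, slot format and limits are invented.

SAFE_RANGES = {"voltage": (0.0, 5.0), "duration_s": (1, 60)}

class ExperimentScheduler:
    def __init__(self):
        self.bookings = {}  # slot identifier -> (student, params)

    def book(self, slot, student, params):
        """Reserve a slot, rejecting double bookings and unsafe values."""
        if slot in self.bookings:
            raise ValueError(f"slot {slot} is already booked")
        for name, value in params.items():
            lo, hi = SAFE_RANGES[name]
            if not lo <= value <= hi:
                raise ValueError(f"{name}={value} outside safe range {lo}-{hi}")
        self.bookings[slot] = (student, params)
        return True
```

Wrapping this in a standard web front end would be routine; the point is that the safety validation lives server-side, which addresses the “safe values” worry without a human in the loop for every run (resetting the rig between runs would still need one).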