6. Ethics

Fears and concerns

The collection and interpretation of data from a wide variety of sources understandably raises some concerns about the appropriateness of data collection and usage.

Some campus users will be sceptical of the value of some of the potential benefits and applications of the intelligent campus. They will quite reasonably be protective of their personal data, conscious of security, and wary of misinterpretation. Even in an age where sharing data through an app is commonplace, and users pay scant attention to the extent of that sharing, the fears and concerns of individuals should not be underestimated or dismissed.

This can be considered in a number of key areas:

  • Awareness and control of one’s own data and its usage
  • Respecting individual privacy
  • Appropriate interpretation and decision making
  • Clear and transparent processes and policies

Using data and analytics is likely to involve the introduction of new devices and systems and changes to policies and processes. Different types of data may be collected about individuals, groups and their activities; analysis will be undertaken, decisions reached and interventions made. Alongside the positive benefits this may bring, there is scope for misinterpretation and misuse leading to negative consequences. Those designing and implementing applications within the intelligent campus therefore have a responsibility to provide reassurance and effective management.

Questions that campus users may have include:

  1. What data is being collected about me?
  2. Why is it being collected?
  3. What will the data be used for?
  4. How is it being interpreted?
  5. What actions will be taken as a result?
  6. Who will see the data?
  7. Can I control what data is collected and shared?

Specific concerns include the notion of being tracked: that an individual's location is being monitored and the information used to mount a form of surveillance, or to check up on people. The original intention might have an appropriate justification, such as logging attendance or clocking in for work, but other interpretations could be made: how many breaks you take, or how often you go to the toilet. Whether or not these are intended uses, collecting the data raises concerns about how it might be used.

Jisc have produced a code of practice for learning analytics, which covers in some depth a number of the topics referred to below and is a useful reference for those wanting to explore further.

Personal data and privacy

Personal data is defined by the Data Protection Act as data relating to a living individual who can be identified, not just from the data itself but also from other data or information that could be in the possession of the "data controller". Individuals also have the right to correct inaccurate personal data recorded about them.

Some of the data used in intelligent campus activities might be thought of as not personal. However, when different types of data from different sources are combined, for example precise location and user behaviour, it becomes easier to identify individuals. Aggregating anonymised data can lead to a better understanding of user behaviour and the management of facilities, but it can also reduce privacy.

For example: a room is booked by a student society; the membership of that society is known; movement of anonymous individuals shows a group congregating in that room; attendance records show who is present at lectures that day; and slowly a picture is built up of who is doing what and when. One key point is that access to data, and to the analysis of that data, should be limited to those who have a legitimate need to view it. This leads to a number of other policy and procedural issues that need to be addressed, some of which are covered in the following sections.
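
To make the linkage in the scenario above concrete, here is a minimal Python sketch. All names, identifiers and data are invented for illustration; it shows the principle of re-identification by cross-referencing, not any real system.

```python
# Hypothetical illustration (invented data): datasets that are harmless
# on their own can be cross-referenced to re-identify "anonymous" people.

# Anonymised occupancy sensing: device IDs seen in a room.
room_occupancy = {"room_101": {"dev_17", "dev_23", "dev_42"}}

# Room bookings: which group reserved the room.
bookings = {"room_101": "chess_society"}

# Society membership and that day's attendance records.
society_members = {"chess_society": {"alice", "bob", "carol"}}
on_campus_today = {"alice", "carol", "dan"}

# Linking the sources narrows the anonymous devices down to a small
# candidate set of named individuals.
room = "room_101"
candidates = society_members[bookings[room]] & on_campus_today
print(f"{len(room_occupancy[room])} anonymous devices in {room}; "
      f"likely identities: {sorted(candidates)}")
```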

Responsibility

Universities generally have policies relating to the use of data, but are they sufficient to cover the increasing complexity of different data types, sources and integration?

This includes the notion of responsibility, which spans both legal and ethical concerns. The different elements of collection, anonymisation, analysis and decision making each need clear and specific responsibility assigned, covering the objectives and intentions, the interventions to be carried out, and the retention and stewardship of data. This could involve staff and services from different parts of the university, including IT, student services, legal and policy representatives. In addition, those potentially impacted by the practices should be consulted at all stages of design and implementation.

Transparency, consent and sharing

The objectives and processes involved in collecting and analysing data should be made clear to the individuals involved. Obtaining consent from individuals to use data is critical, and three aspects of consent can be considered in the context of the intelligent campus and analytics:

  1. Gathering – how the data is collected or recorded
  2. Processing – concerning the interpretation of the data
  3. Actioning – making interventions on the basis of the decisions reached

Having appropriate policies, effectively implemented across these three aspects, is important to ensure individuals are fully aware of what data is being collected and used, and have made informed decisions about that usage.

As an example of these different levels: if anonymised data is collected generally about the movement of people, this might fall under consent for "gathering". If the location of a specific individual is being collected and used to determine behaviour, this would require consent under the "processing" aspect. If the result of the interpretation is that contextual notifications or other information are sent to an individual, then consent would be expected under "actioning".
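
One way these levels could be represented in software is as an ordered scale where each level implies the ones below it. That ordering is a simplifying assumption for illustration, and the names and data below are invented rather than taken from any particular system.

```python
from enum import IntEnum

class Consent(IntEnum):
    """Illustrative consent levels; assumes each level implies those below."""
    NONE = 0
    GATHERING = 1   # anonymised collection, e.g. general movement data
    PROCESSING = 2  # interpreting an identified individual's data
    ACTIONING = 3   # interventions such as contextual notifications

# Consent recorded per user (invented example data).
user_consent = {"u123": Consent.PROCESSING}

def permitted(user_id: str, required: Consent) -> bool:
    """Return True if the user has granted at least the required level."""
    return user_consent.get(user_id, Consent.NONE) >= required

# u123's location data may be interpreted, but no notification may be sent.
assert permitted("u123", Consent.PROCESSING)
assert not permitted("u123", Consent.ACTIONING)
```

In practice consent may not be strictly ordered: a user might accept contextual notifications while refusing detailed profiling, so a real system would likely record each aspect independently rather than on a single scale.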

Another aspect of sharing and consent relates to current approaches to apps on devices such as smartphones. Users readily accept sharing requirements when they install apps and accept the terms, which may include the sharing of that data with third parties, for example for advertising.

Typically this includes location, but can also include your email address, contacts, search terms or even access to your camera. Specific examples include health apps collecting sensitive information about a person's health, diet and activities, and social media apps holding information on interactions and social groups. Many apps also ask for permission to access other data that seems unrelated to the app's purpose.

Why are users seemingly relaxed about accepting various sharing conditions for apps, yet concerned about issues like surveillance when it is their university? One partial reason could be a lack of awareness of exactly what is being shared and why. Another could be the perceived impact of such sharing: the university might be seen to play a more significant part in their life than a seemingly faceless company collecting information for general use or advertising. For example, being at university can be a life-defining period for students, with an impact on future career and social groups. Equally, for staff, being tracked or monitored by their employer might have perceived consequences for career prospects, performance reviews or compliance with policies and procedures.

Universities could choose to add clauses to their terms and conditions, such that users accept these terms as part of wider acceptance of usage policies. However, users may tick the box to agree without really being aware of the implications, and having technically received their consent doesn’t alleviate fears and concerns or avoid problems arising later.

Educating users to be more aware of what data they are sharing, and to switch off permissions when not needed, is important, alongside transparency from apps and services in how they promote their facilities. Educating students about security and privacy would be helpful regardless of whether the data is collected by the university itself or by other parties. This could be considered an aspect of digital literacy: the competencies needed to participate effectively in a digital knowledge society.

Interpretation and validity

There are reasonable concerns over the appropriateness of linking data together and drawing conclusions about related factors from this information. For example, consider a future scenario: institutional knowledge about a student's learning, attendance and progress, combined with details of their current finances held by them personally, might suggest possible difficulties or anxieties and lead to suggestions about services such as counselling, even making a live chat available instantly.

This may be seen as a valuable intervention, but it relies on interpreting data and reaching conclusions that may be flawed, not to mention access to sensitive data. Particular care needs to be taken in designing algorithms that make such interpretations, so that decisions are free from bias or unwarranted assumptions and are reliable and appropriate. As we move further into "intelligence", with algorithms that can learn and adapt, we need to be aware of the potential for data-driven algorithms to learn our prejudices and lead to undesirable, even illegal, outcomes such as discrimination.
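
As a simple illustration of checking an algorithm's output for bias, one approach is to compare how often it flags individuals in different groups. The data below is invented, and a real audit would be considerably more rigorous.

```python
# Hypothetical audit: compare an algorithm's flagging rate across groups.
decisions = [
    {"group": "A", "flagged": True},  {"group": "A", "flagged": False},
    {"group": "A", "flagged": False}, {"group": "A", "flagged": False},
    {"group": "B", "flagged": True},  {"group": "B", "flagged": True},
    {"group": "B", "flagged": False}, {"group": "B", "flagged": False},
]

rates = {}
for d in decisions:
    rates.setdefault(d["group"], []).append(d["flagged"])

for group, flags in sorted(rates.items()):
    print(f"group {group}: {sum(flags) / len(flags):.0%} flagged")

# A large gap between groups (here 25% vs 50%) does not prove
# discrimination, but it signals that the algorithm and its training
# data need investigating before its interventions are trusted.
```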

Other examples include the use of facial recognition to assess emotions and link these to understanding or anxiety. However, reading facial expressions is complex: a frown could mean confusion or concentration, and we would need to be confident that such an application was based on reliable evidence of success.

Data collection and processing should be subject to the same measures of quality, validity and robustness that might be applied to research, for example: identifying inaccuracies, being aware of incomplete data, choosing data sources carefully, and correlating data sets appropriately. Considerations of validity, usefulness and appropriateness apply equally to the algorithms and interventions.
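
As an illustration of such routine checks, the sketch below flags incomplete or implausible records before they feed into any analysis; the field names and thresholds are invented for the example.

```python
# Illustrative data-quality checks on invented room-usage records.
records = [
    {"user": "u1", "room": "101", "minutes": 55},
    {"user": "u2", "room": None,  "minutes": 50},    # incomplete
    {"user": "u3", "room": "101", "minutes": 1440},  # implausible
]

def quality_issues(record):
    """Return a list of problems found in a single record."""
    issues = []
    if record["room"] is None:
        issues.append("missing room")
    if not 0 <= record["minutes"] <= 600:  # assumed plausible range
        issues.append("implausible duration")
    return issues

for r in records:
    problems = quality_issues(r)
    if problems:
        print(f"{r['user']}: {', '.join(problems)}")
```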

Having rigorous processes across the three phases of gathering, processing and actioning, combined with careful consideration of the concerns of users will help to deliver benefits to users of the intelligent campus.