In lecture halls, classrooms, study areas and virtual learning environments something is haunting education… the ‘promise’ of data analytics.
As part of the increasing datafication of formal education systems, recent years have seen a burgeoning use of learning analytics to measure, gather, analyse and report data about learners and their contexts. This growing array of data-sorting protocols, utilised to visualise, personalise, predict and intervene in students' learning experiences, increasingly operates at personal, institutional and state policy levels. Consider just a few examples. Kno Me is a personal study dashboard that tracks students' use of eTextbooks, providing real-time statistics on time spent reading, notes added and flashcards mastered. Learning Catalytics, a web-based tool for managing the interactive classroom, obtains real-time responses to open-ended questions, determines which areas need additional explanation and then groups students for further discussion. Utilising IBM technology, Victoria's Department of Education and Early Childhood Development recently used predictive analytics to estimate disengagement and dropout rates for 36,000 students.
While there is a long history of educational data analysis, these new systems represent a radical change in terms of proliferation, volume, speed and reach. They are commonly introduced with the declared aims of improving the student experience, bolstering academic performance, fostering well-being and reducing dropout and failure rates. Although some commentators have expressed concern about privacy issues and the possibility of misinterpreting data, such tools are widely supported and generally well regarded within educational communities. After all, it might appear vexatious to question the operation of technologies that aim to fundamentally improve learning and engagement. Yet there is a 'phantom objectivity' that menacingly surrounds much of the discourse on high-tech objects. Technology is thus treated as an abstract notion, somehow autonomous of its social context. It appears completely rational, devoid of good and evil, whilst its relation to people is obscured. As Leo Marx suggests (2010, 576), '[b]y consigning technologies to the realm of things, this well-established iconography distracts attention from the human – socio-economic and political – relations which largely determine who uses them and for what purpose'. This obfuscation discourages critical consideration not only of the circumstances that give rise to such technologies, but also of the manner in which they (re)shape behaviour and its associated meanings. Echoing such critical notions, Selwyn (2014) argues that educational technology is political, insofar as its use is negotiated, value-laden and open to distrust. Hence it is vital to recognise that learning analytics are created, marketed and utilised in political, economic and cultural contexts. Consequently, educational researchers should be wary of simply asking technical and procedural questions about how to better utilise these systems.
Rather, it is necessary to consider deeper social impacts, whilst exposing the values that underlie the introduction and use of such software. Thus, it might be asked: what ideological values underpin learning analytics, how does their use change social behaviour, what is their impact on student and teacher self-perception, who benefits from their use, and how are existing power relations reconfigured? While answers to these questions might serve to illustrate the positive potential of such technology, they will also highlight some of the more problematic issues, offering more nuanced insights that could exorcise phantom objectivity.
Yet it is not sufficient merely to question the nature of the various impacts of learning analytics. Rather, in exploring and expounding the underlying dynamics, it is also useful to draw on social theory to facilitate a deeper understanding of the processes and practices involved. Indeed, at the very heart of critical approaches to educational technology research should be the use of socio-cultural theory. To this end, the writings of the French social historian Michel Foucault provide not just some helpful conceptual tools for considering the (mis)use of such software, but also a means of critically engaging with broader notions of educational technology. Although Foucault did not write a single text focusing on education and never directly addressed the issue of newly emerging digital technologies, since the mid-1970s his work has provided a cornerstone for critical pedagogy. He eschewed general meta-level theories, instead suggesting individuals use his ideas freely and draw upon his books selectively as 'little tool boxes'. Reflecting this advice, a number of Foucauldian concepts will be briefly introduced to highlight critical questions that could be raised concerning the use of learning analytics, and to illustrate the analytical benefits of drawing on such theoretical insights.
Firstly, it is useful to note that Foucault was interested in a broader notion of 'technology' as activities, knowledge and modes of organisation. From this viewpoint, when considering the impact of analytics one should not concentrate on the software application alone, but look at the wider practices and procedures that surround its use. Such an approach broadens the focus, encouraging questions about how learning analytics are situated with regard to existing practices of assessment and evaluation, the general dynamics of classroom management, and staffing. Furthermore, Foucault's notion of 'technologies of the self' raises the issue of how such devices (re)constitute students' identities, as they compliantly use dashboards to adjust their behaviour or seek to resist such pressures. Secondly, drawing upon Foucault's work on discipline, it could be asked how such tools seek to 'train' learners into particular ways of behaving, normalising the self-policing of student behaviour while inducing a sense of prolonged visibility. Significantly, Foucault asserts that the norm functions as a 'principle of coercion in teaching', with practices such as tests combining surveillance with normalising judgements. While it is important to consider how such discipline operates in practice, it is also salient to explore how it is resisted. Thirdly, Foucault's notion of power/knowledge highlights that knowledge is always an exercise of power, and power always a function of knowledge. Through learning analytics, certain 'ways of knowing' become privileged. Questions arise concerning not merely curriculum content but also the 'sanctioned' manner of its delivery, and how this will impact on students as well as educators. It should also be noted that data processes such as measuring, sorting, analysing and reporting are never neutral or objective, but are powerful forms of meaning-making.
Consequently, the manner in which data are constructed and utilised as forms of knowledge/power needs to be considered. Fourthly, Foucault's conceptualisation of governmentality, highlighting the production of compliant populations, draws attention to how learning analytics could act as part of a broader strategy of neoliberal surveillance that seeks to shape people's activities while increasingly injecting market principles into everyday life. Outside of formal education, such practices are already evident in the 'quantified self' movement, which seeks to use technology in the self-tracking of biological, physical, behavioural or environmental information. Governmentality introduces an element of political economy into the analysis, emphasising that student behaviour is increasingly defined and regulated through marketisation. Thus, far from being mere objective tools, data analytics can be reinterpreted as central to neoliberal market operations in late modernity.
Inevitably, there are other Foucauldian concepts that might equally inform a critical analysis of learning analytics, yet hopefully the point has now been adequately made and some useful trajectories for future research suggested. In conclusion, it should be noted that adopting such an approach is not a dystopian, 'doomster' strategy that denies the possible positive outcomes of learning analytics. Rather, it is an attempt to avoid 'phantom objectivity', recognising not only the promise of such technologies but also the associated threats. After all, in John Stuart Mill's words, those who know only their own side of the case know little of that.
About the author
Andrew Hope is an Associate Professor in Sociology at the University of Adelaide. Originally based within the School of Education, he is currently on secondment to the School of Social Sciences. His profile can be found at: http://www.adelaide.edu.au/directory/andrew.hope.
Marx, L. (2010) Technology: the emergence of a hazardous concept. Technology and Culture, 51(3): 561–577.
Selwyn, N. (2014) Distrusting educational technology: critical questions for changing times. London: Routledge.
Recent publications by the author using Foucauldian concepts to critically explore technology use in education:
Hope, A. (forthcoming) Biopower and school surveillance technologies 2.0. The British Journal of Sociology of Education.
Hope, A. (forthcoming) Foucault’s toolbox: critical insights for education and technology researchers. Learning, Media and Technology. Pre-print version available at http://www.tandfonline.com/doi/abs/10.1080/17439884.2014.953546#.VIDzjP6KCUk
Hope, A. (forthcoming) Schoolchildren, governmentality and national e-safety policy discourse. Discourse: Studies in the Cultural Politics of Education, 36(4). Pre-print version available at http://www.tandfonline.com/doi/full/10.1080/01596306.2013.871237
Hope, A. (2013) Foucault, panopticism and school surveillance research. In M. Murphy (Ed.) Social Theory and Education Research. London: Routledge, 35–51.