Digital technology use in schools is clearly not working, at least not in the way that we are led to believe by the state-of-the-art examples routinely used in marketing campaigns, whether by political parties or by hardware and software manufacturers. In contrast to the state-of-the-art, the state-of-the-actual in most classrooms is very different. The promised digital education revolution has not arrived. The ‘wicked problem’ of educational technology use is not uniform: technology use varies markedly depending on geographic location, indigeneity and socio-economic background. The ‘messy’ reality facing teachers and school leaders is not helped by Pollyanna-ish discussions about ‘challenging but reasonable’ expectations developed by researchers who continue to peddle an unhelpful ‘digital native’ perspective. What is required is a thoughtful, mature and critical debate that recognises both the affordances and the limitations of digital technologies in educational contexts. As Mark Brown from Massey University suggests, “e-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly”.
The $2.4 billion Digital Education Revolution promised to put computers in the hands of all secondary school students and to “contribute a meaningful and sustained change to teaching and learning in Australian schools that will prepare students for further education, training and work in a digital world” (Australian National Audit Office, 2015). The investment of billions of dollars in hardware, software and teacher education continues, despite a lack of nuanced, critical research being considered by politicians, school leaders and, most importantly, teachers.
In October this year, the fourth ICT national assessment program (NAP) data will be released by ACARA. Unlike the NAPLAN tests, which attract massive amounts of attention and debate, the NAP testing for ICT literacy remains remarkably unheralded; indeed, most academic and popular commentators are unaware of anything other than the literacy and numeracy testing that dominates attention in this country. That specific focus on literacy and numeracy has once again captured the attention of the nation, with data released this week indicating that little significant improvement in students’ literacy and numeracy levels has occurred over the past seven years. It is likely that in-depth reviews of teachers’ professional development and curriculum changes, coupled with debates about the place of standardised testing, will follow the latest NAPLAN results; however, the data does reveal that significant numbers of Australian students, typically above 90%, are achieving the national minimum standard for both literacy and numeracy.
Unlike the extraordinarily detailed individual and publicly available school-based reports that follow the NAPLAN testing, the NAP testing for ICT literacy is made available only in one public report, which is routinely overlooked, while “participating schools [only] receive a basic report about the performance of their students” (Australian Curriculum Assessment and Reporting Authority, 2013). Despite the dearth of individualised reporting that could enhance the ways teachers and students use educational technology, the general results made available by the Australian Curriculum, Assessment and Reporting Authority (ACARA) do allow comparisons to be made between the numbers of students achieving national minimum standards in NAPLAN and in ICT literacy (ICT NAP).
Robert Randall, the head of ACARA, has this week expressed concern about NAPLAN results that continue to show more than 90% of Australia’s 3.6 million school students achieving national minimum standards in literacy and numeracy; yet there has been no public comment or debate about previous ICT NAP results, which indicate that an increasing number of Year 10 Australian students are failing to meet the national minimum standard for ICT despite the billions of dollars invested in the Digital Education Revolution. Perhaps more startling is the fact that, since ICT NAP testing began a decade ago, no more than 66% of students at Year 6 or Year 10 across the country have achieved the prescribed national minimum standards – a far cry from the 90% of students meeting similar standards in NAPLAN testing.
While this is a very concerning situation, the picture of educational technology use in Australian schools becomes increasingly grim when considered in light of additional ICT NAP data showing that, in 2011, some Australian states had fewer than 50% of students meeting minimum ICT expectations, continuing a pattern evident since ICT NAP testing began in 2005. Taking Victoria as a more detailed case, and treating the 2011 ICT NAP figures from students in Years 6 and 10 as indicative of the competencies of other Victorian students in primary and secondary schools, it becomes evident the schooling system is failing 316,245 students.
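The kind of back-of-envelope extrapolation behind a headline figure like 316,245 can be sketched in a few lines. The enrolment number and pass rate below are illustrative assumptions for demonstration only, not figures taken from the ICT NAP data:

```python
# Sketch of the extrapolation: apply a cohort's measured pass rate
# to a wider student population to estimate how many fall below
# the minimum standard. Inputs here are hypothetical.

def students_below_standard(enrolment: int, pass_rate: float) -> int:
    """Estimate students below the minimum standard, given a pass rate."""
    return round(enrolment * (1 - pass_rate))

# e.g. a hypothetical state cohort of 650,000 students where only
# 49% meet the minimum ICT expectation
print(students_below_standard(650_000, 0.49))  # → 331500
```

The point of the sketch is simply that once fewer than half a cohort meet the standard, the absolute number of students affected in a state-sized population runs into the hundreds of thousands.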
It is important for us to maintain a clear understanding of the progress students are making in terms of literacy and numeracy, but perhaps it is time to focus more clearly on what is going on, or rather what is not going on, in terms of the ways in which digital technologies are being used in Australian schools.
About the author
Dr Michael Phillips is a member of the LNM group and a lecturer in Teacher Education and Educational Technology in the Faculty of Education, Monash.
Australian Curriculum Assessment and Reporting Authority. (2013). Test results. National Assessment Program. Retrieved January 12, 2015, from http://www.nap.edu.au/results-and-reports/test-results.html
Australian National Audit Office. (2015). Digital Education Revolution program – National Secondary Schools Computer Fund. Canberra, Australia: Australian National Audit Office. Retrieved from http://www.anao.gov.au/Publications/Audit-Reports/2010-2011/Digital-Education-Revolution-program—-National-Secondary-Schools-Computer-Fund/Audit-brochure