This is the script of my short segment for the “mythbusters” episode of Meet the Education Researcher. A brief rant about “digital natives” as a zombie idea in education.
Carlo Perrotta @carloper
The “digital natives” idea is a stubborn label that does not want to die – like many other zombie ideas in education such as learning styles or, if you are more into the policy side of things, school choice.
Marc Prensky, a public speaker and a “thinker”, wrote an article in 2001 introducing the distinction between digital natives and digital immigrants. In the original article, digital natives are those born in the mid-1980s and the 1990s who cannot remember a time when digital technology was not a pervasive presence in society. Digital immigrants, by contrast, are “not born into the digital world but have, at some later point in their lives, become fascinated by and adopted many or most aspects of the new technology”. The key difference between the two demographics is that while digital natives are “native speakers of the digital language of computers, video games and the Internet”, older folks will always retain a pre-digital “accent” (a “foot in the past”).
Prensky goes even further: for digital natives, constant immersion in technology from birth has had distinct neurological influences, in much the same way as learning a language from birth influences cognitive and neurological structures. The implication for education is that new generations learn better in technology-rich environments, are very good at multitasking, are visual learners, and so forth.
The evidence from educational research tells a different story. It is true that young people have specific and “generational” ways of engaging with digital technologies, but the idea that immersion in a technology-rich world is somehow comparable to language acquisition, and that their brains have developed or mutated to accommodate that world, is, of course, bullshit.
The reason this idea has survived for so long is that it has, first and foremost, market appeal. It is a convenient shortcut for quick and dirty market segmentation. The bottom line is this: if somebody still talks about digital natives in 2019, they are probably trying to sell you something.
One myth that needs to be dispelled once and for all is that exposure to technology from birth causes a specific form of neuroplasticity: actual changes in the brain. This is a typical case of empirical facts being misrepresented to fit a particular narrative. Of course there is evidence of neuroplasticity, and it is true that the human brain changes over the lifespan. However, these changes are associated with a) major traumas and b) broad lifestyle patterns.
The advice to educators is to be sceptical and circumspect.
The most important thing to keep in mind about technology use among young people is the need to avoid determinism: the idea that technology somehow determines aspects of personal and social life.
What research actually suggests is that pre-existing social and economic arrangements, complex and multidimensional factors like class, status, family background, and where one lives (the global north vs. the global south), better explain large-scale differences in technology use and misuse among young people.