Do Digital Natives Really Exist?

If you ask the average baby-boomer-aged internet user who they turn to for help when their devices or systems fail, they will undoubtedly say they ask someone younger. Indeed, the millennial generation (those born after 1980)—the first to be called “digital natives”—is often credited with being naturally better at adapting to technology thanks to early exposure to it. The term was coined in 2001, and since then it has been used by everyone from marketers to so-called digital natives themselves.
But is the term digital native, like so much in the digital world, just driven by hype? New research suggests that this might be the case. An article in the journal Teaching and Teacher Education, which examined the issue through an educational lens, found that the idea that “digital natives” are naturally better at technology is inherently flawed. The study notes that “Current discussions about educational policy and practice are often embedded in a mind-set that considers students who were born in an age of omnipresent digital media to be fundamentally different from previous generations of students” but that “there is no such thing as a digital native who is information-skilled simply because (s)he has never known a world that was not digital.”
This finding upends many of our assumptions about how generations use technology, and it has implications for fields including corporate employment, education, and commerce. As Quartz wrote, “If the idea of ‘digital natives’ was just jargon that advertisers used to sell to the under-30 crowd, all this might not matter much. But the idea that digital natives are fundamentally different is influencing everything from the way curriculum is designed to the way companies shape their corporate work environments.”
The authors of the study explain that redesigning educational approaches on the assumption that digital natives will intuitively know how to use them is a mistake. The better approach is to “treat people as human, cognitive learners and stop considering one specific group to have special powers,” a co-author of the study said.
Quartz went on to point out that the study points to how our assumptions about the innate abilities of digital natives could be harmful in a different way: “The Teaching and Teacher Education paper raises another concern. Digital natives are assumed to be able to multitask, it warns. But the evidence for this is also scant. Reading text messages during university lectures almost certainly comes at a cognitive cost. So too, employers might assume, does fiddling with smartphones and laptops in meetings. Buy that technologically innovative insurance policy another time.”
In other words, younger generations do not have cognitive superpowers that allow them to spread their attention more widely without losing some amount of focus. The idea that human cognitive abilities changed so drastically from one generation to the next should have struck us as scientifically dubious from the start. It certainly helps marketers to think this way, but it may not help the so-called digital natives themselves.
Despite all this, we don’t need to go back to thinking that millennials are exactly the same as their parents. While it can be helpful to categorize generations in some ways—researchers point to areas like “workplace preferences, life goals, religious participation, alcohol and drug use, and trust in institutions” where millennials do seem to have distinct views—assuming they have superior learning skills might not be so helpful. This is especially true given that not everyone has had identical access to digital technology while growing up.
Hopefully this research will allow educators and policymakers to approach changes to curricula and workplaces with a more nuanced understanding of what this generation actually needs.