The average adult’s attention span is shorter than a goldfish’s — just eight seconds — and it’s only getting shorter.
That’s what everyone seems to say, anyway.
But in today’s stimulus-rich environment, with its constant bombardment of information, the truth about our attention spans may be more complex than we’ve been led to believe.
The goldfish anecdote is simple yet effective. It suggests human attention spans have become as short as, if not shorter than, those of a much smaller-brained animal. The unspoken assumption is that our devices are to blame.
However, this is more of a blanket statement than the universal truth it’s often portrayed as.
And it’s not even true.
The shocking statistic comparing adults to goldfish was picked up by reputable sources, including Time Magazine, The Independent, and even The New York Times in 2015. These sources all credited the statistic to a supposed Microsoft Advertising survey, which reportedly surveyed 1,200 Canadians about their internet usage.
In reality, this was not a Microsoft study. Rather, a member of Microsoft’s advertising team found it on a website titled Statistics Brain — now called Statistic Brain — which was impersonating an academic institution.
Despite its false origins, the statistic stuck for a reason: it felt plausible.
In 2022, the Policy Institute and Centre for Attention Studies at King’s College London conducted a survey of the U.K. public and found 50 percent of respondents “wrongly believed the average attention span among adults today is just eight seconds long.”
Half also believed that humans have worse attention spans than goldfish, showing just how far the false 2015 statistic has spread.
It makes sense that our attention spans could seem so minuscule, especially after observing our own behaviors and attentional tendencies.
Shaun Vecera, a professor of psychological and brain sciences at the University of Iowa, suggested that our attention spans aren’t necessarily getting shorter; rather, we’re experiencing more distractions than ever.
“We’ve created environments where there are more alerts, and pings, and other distractions that have made it harder for us to be able to apply attention in any sort of sustained way,” Vecera said.
These distractions also carry enticing, reward-based incentives that capture our attention. When it comes to phones and other devices, the implied social value of a notification activates the brain’s dopaminergic reward system, encouraging users to seek out and anticipate such interruptions.
Even the mere presence of our phones — without an alert or notification — is enough to distract us. We’re aware of the potential for reward. Sometimes, it’s the thought of a distraction that does the distracting, something known as an endogenous interruption.
Another flaw in the goldfish anecdote is that it treats the human attention span as a single, unchanging quantity.
People require different levels of attention for different tasks. The attention needed to drive in heavy traffic is different from the attention required to carefully analyze themes and characters in a novel.
So, in some instances, short attention isn’t necessarily a bad thing. When driving in heavy traffic, our attention can’t linger on any one detail for too long, or we risk missing something potentially important, or even dangerous, such as a pedestrian darting into the street or a car trying to merge.
It’s important to acknowledge that attention is sensitive to the demands of the environment. Short attention isn’t always bad, and long attention isn’t always good. It’s a complex and ever-changing concept.
When it comes to staying focused on what’s most important, Vecera suggests it’s a matter of “taking back the control of your attention instead of letting the world determine what you’re attending to based on all of the notifications and alerts that you get.”