Daily Beirut

Culture & Society

Is Gen Z Really the 'Dumbest' Generation?

Evidence shows a decline in cognitive test scores and academic skills among Gen Z, linked to digital environments, but biology isn't the cause.

In January 2018, researcher Ole Rogeberg walked into his colleague Bernt Bratsberg's office at the Ragnar Frisch Centre for Economic Research in Oslo, Norway, and announced that the data analysis was complete. From his bright eyes, Bratsberg understood that the results confirmed what they both expected. They were ready to deliver a new blow to the scientific community: evidence that the world's youth were drifting toward what some bluntly call "stupidity."

The notion that there is a biological difference in brain structure between generations is fundamentally wrong. The time gap between one generation and the next is far too short for such clear biological changes to occur. However, context does leave its mark — everything from politics and economics to education and health can shape a generation's characteristics, leading some to note that a particular generation seems "different" in some way.

For example, some currently speak of Gen Z's higher capacity for independence, revolution, and challenging authority, attributing this to their connection to a digital society where traditional institutions hold less power. Yet, Millennials, when young, drove the Arab Spring revolutions. Many judgments about Gen Z today may stem not from a fundamental difference in the generation itself, but from observing them during their natural age of political and social rebellion — typically between late adolescence and mid-twenties — at a specific historical moment with more mature digital tools.

This skepticism is crucial when encountering any results about generational traits. In recent years, a recurring headline claims Gen Z — those born roughly between 1997 and 2012, who grew up with the internet, smartphones, and social media from childhood or early adolescence — is the least intelligent in history. Scientific evidence does not support such a simplistic claim, but it does point to something more precise worth examining.

Declining Scores in Core Skills

Since the 1990s and early 2000s, several countries have shown signs of declining results in cognitive ability tests, such as IQ tests, alongside a notable drop in academic achievement, particularly in reading, mathematics, and science. The most robust data comes from broad assessments by organizations like the OECD, which show that average performance in member countries fell in 2022 compared to 2018 by about 10 points in reading and 15 points in mathematics. This decline is significant by usual standards, equivalent to a learning delay of roughly three-quarters of a school year in some estimates.
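As a rough illustration of how test-point drops map onto "years of learning," here is a back-of-envelope sketch. The 20-points-per-school-year conversion is an assumption on my part, a commonly cited OECD rule of thumb rather than a figure stated in this article, and real conversions vary by subject and study.

```python
# Back-of-envelope conversion of PISA score drops into "years of learning".
# ASSUMPTION: ~20 PISA points correspond to roughly one school year, a
# commonly cited OECD rule of thumb; actual conversions vary by subject.
POINTS_PER_SCHOOL_YEAR = 20

# OECD average declines, 2022 vs 2018, as cited in the text above.
drops = {"reading": 10, "mathematics": 15}

for subject, points in drops.items():
    years = points / POINTS_PER_SCHOOL_YEAR
    print(f"{subject}: -{points} points ≈ {years:.2f} school years")
# Under this assumption, the 15-point mathematics drop works out to
# about 0.75 of a school year, matching the "three-quarters of a
# school year" estimate mentioned above.
```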

In the United States, the NAEP assessment — known as the "Nation's Report Card" — paints a similar picture. Results for 12th graders in 2024 show continued declines in reading and math scores compared to earlier cycles. The average reading score was lower than in 2019, and also below older historical levels. Officials treat these results as alarming because they represent a deterioration in fundamental learning outcomes across an entire country, not just a single school or state.

The simplified meaning of these numbers is that a segment of today's teenagers, on average, has become weaker in reading that requires comprehension and inference, and less proficient in school mathematics that translates into problem-solving skills. This alone does not justify the blanket statement that "intelligence has declined," but it provides a strong scientific basis for saying that crucial cognitive and educational skills have clearly regressed.

The Flynn Effect and Its Reversal

This brings us to the "Flynn Effect," named after James R. Flynn, an American-born political scientist who spent most of his career in New Zealand. Flynn observed that average IQ scores rose from one generation to the next when older and newer cohorts were scored against the same test norms. The phenomenon was first tracked in IQ test results of the American population over earlier decades, then documented in other economically advanced countries, and is usually summarized as a gain of about 3 to 5 IQ points per decade. The effect was particularly pronounced in tests of so-called "fluid" intelligence, which require reasoning one's way to a logical conclusion from abstract information rather than drawing on learned knowledge.

This rise was interpreted as a result of broad environmental improvements over decades, such as better education, health, nutrition, and working and living conditions. However, this phenomenon is not a permanent law. Since the late 20th century, several countries began showing signs of the curve "flattening" and then reversing — what some call the "Flynn Reversal."

Rogeberg and Bratsberg's 2018 study is among the strongest research in this area. It relied on massive data from mandatory military conscription tests covering hundreds of thousands of Norwegian men born between 1962 and 1991, providing a long time series with repeated measurements on large samples. The tests typically included subtests such as arithmetic, vocabulary, and verbal similarities, combined into a single score of "general ability."

The study's strength lay not only in its sample size but also in its clever design using "within-family comparison." The researchers didn't just compare generations across society; they looked at siblings within the same family. If a younger brother, born years after his older sibling, showed the same overall trend — rising then falling — it meant the change couldn't be easily explained by factors like "less educated families had more children," "the population's genetic makeup changed," or "certain family types became more represented in the new generation." If the cause were purely genetic or demographic, the difference would appear between families, not within them.
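The within-family logic can be illustrated with a toy simulation; this is a hypothetical sketch of the idea, not the authors' actual analysis. Siblings share a family-level ability component, so differencing a younger brother's score against his older brother's cancels family factors and leaves only the between-cohort environmental change.

```python
# Toy sketch of within-family comparison (hypothetical illustration,
# not the Bratsberg & Rogeberg code). If the IQ trend is environmental,
# it survives differencing out everything siblings share.
import random

random.seed(0)

def sibling_gap(cohort_effect: float) -> float:
    """Score difference (younger minus older) for one pair of brothers.
    Both share the same family-level ability; only the younger brother
    is exposed to the later birth cohort's environment."""
    family_ability = random.gauss(100, 10)      # shared genes/home factors
    older = family_ability + random.gauss(0, 5)
    younger = family_ability + cohort_effect + random.gauss(0, 5)
    return younger - older

# Suppose the environment lowered scores by 2 IQ points between cohorts:
gaps = [sibling_gap(cohort_effect=-2.0) for _ in range(20_000)]
mean_gap = sum(gaps) / len(gaps)
print(f"mean within-family gap: {mean_gap:.1f}")  # ≈ -2.0
```

Because the shared family component cancels in the subtraction, a nonzero mean gap points to the changing environment rather than to which families had children; a purely between-family (genetic or compositional) story would leave the within-family gap near zero.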

Importantly, the researchers did not merely document the rise, plateau, and subsequent fall in IQ scores. They attempted to explain it, finding that the pattern of gains until the cohorts of the 1970s, followed by decline in later cohorts, could largely be attributed to changes in environmental factors, education, and lifestyle after World War II. Another study in the journal Intelligence identified the mid-1990s as the turning point, when the long ascent stopped, flattened, and then turned downward. Similar results came from studies of Danish conscripts, where researchers observed a decades-long rise in IQ scores followed by a later decline from the peak.

The Digital Classroom and Attention Deficit

As of this writing, the exact cause of this change is not definitively known, but "digital education" is a prominent suspect. If reading and math scores are declining in large-scale tests, and children and adolescents are spending more hours in front of screens, it's easy to connect the two. Testimony from neuroscientist and educator Jared Cooney Horvath before the U.S. Senate Committee on Commerce, Science, and Transportation on January 15, 2026, in a session dedicated to the impact of screen time on children, reinforced this link.

Horvath presented a clear argument: the massive expansion of screen use in schools did not produce an educational leap. Many of technology's promises in education were measured by quick indicators such as engagement and impressions, not by long-term learning outcomes. This generation has spent more hours in education than any before it, he said, yet its test results are not better than previous generations'; they are worse. The question of screens' role in the classroom remains on the table.

Not surprisingly, the potential negative role of screens has been addressed by other studies discussing the impact of the "daily digital environment," such as excessive phone use, social media, and electronic games. Recent meta-analyses, summarizing dozens of studies, point to a statistically consistent negative correlation between the intensity of use of these tools and academic performance. The effect size is not dramatic, but it recurs across different samples, countries, and measures, making it a concerning general indicator.

Researchers believe these findings have plausible explanations, most of them tied to a decline in attention as a side effect of the digital environment. The digital world pulls attention for a simple reason: it is designed to. Phones are filled with colorful notifications, audible pings, and apps competing for a glance. The modern digital economy is built on small, repeated rewards that trigger dopamine release with every swipe, notification, interaction, and short video; these are environments that reward jumping between stimuli.

Similarly, research evidence tends to confirm that screen use, especially in the evening or just before bed, is associated with delayed sleep, poor quality, and shorter duration in children and adolescents. Sleep is essential for memory consolidation, attention regulation, and mood. So, even if the screen's effect on intelligence is debated, its effect on sleep, and consequently on daily cognitive readiness, is more stable and established.

A third factor compounds the problem: a world of very short clips, posts, and tweets has eroded Gen Z's ability to read long articles or watch long videos such as documentaries. This kind of prolonged engagement with knowledge is necessary for developing critical thinking. When you receive information from a short video, there is no time to engage with it, to accept or reject it; when you spend a month reading a book, the brain has a chance to "process" it.

The bigger issue is that academic learning, especially comprehension and reasoning, requires "continuous focus time," not "attention pulses." The conflict between what the digital social reality imposes and what real learning requires is clear. The ability to sit with a long text or an extended problem without distraction has declined, and this applies especially to Gen Z, who grew up entirely in a digital world, unlike the "hybrid" Millennial generation. For them, an internet connection is not a choice but a way of life, and the encyclopedia they know is Wikipedia, not the multi-volume Britannica.

Here, too, we have real scientific evidence. One of the most famous meta-analyses found an advantage for paper reading over digital reading in comprehension, especially for longer texts or reading conditions that allow for distraction (like a smartphone). This doesn't mean digital reading is doomed to fail, but it means the reading medium is not always neutral, and the shift to short, fast texts may gradually weaken the muscles of deep understanding.

Making matters worse, the products of the digital world, especially social media, deliberately feed this trend. Short clips and posts that provoke surprise, anger, fear, mockery, and controversy are favored by algorithms and spread faster than long, sober, in-depth explanations. These platforms are structurally designed to promote this type of content and amplify the associated emotions for commercial reasons.

What 'Intelligence' Really Means

When we talk about defining "intelligence," it usually refers to one of three things: IQ test scores (psychometric measures with limitations), academic achievement in national or international tests (specifically reading, math, and science), or functional cognitive skills like attention, memory, and problem-solving. Intelligence is a combination of all this. It is not just a number; it is the ability to focus, read and learn, build logic, exercise self-control, manage time, and solve problems. If the digital world can disrupt these functions by draining attention and time, raising noise, and scattering memory, then practically speaking, intelligence is declining. This can explain what research has found so far, particularly regarding Gen Z.

However, this does not mean other generations are immune to the problem; everyone is affected. The specificity of Gen Z is that the most important period for cognitive acquisition in their lives coincided with the digital world revolution. Understanding this is important to avoid the manufactured war over generational superiority. As stated, there is no "genetic" explanation that makes one generation significantly superior or inferior to its predecessors. Any substantial changes point the finger at the changing environment.

If the goal is to improve abilities, the practical path suggested by research evidence begins with a return to the basics of deep learning: restoring daily long-form reading and writing that requires explanation, argument, and examples, instead of over-reliance on multiple-choice questions alone. This type of training builds understanding and reasoning and re-accustoms the brain to continuous attention rather than fragmented flows.

At the school and home level, what is needed is not a war on technology, but smart regulation. This means reducing screens in the classroom in favor of paper, discussion, and experiments, with targeted digital use (training a specific skill, scientific simulation, or formative assessment) rather than comprehensive digitization of everything. At home, studies point to the need to manage the phone environment for teenagers, especially at night, because sleep and attention are the infrastructure for memory and learning. Simple tasks like leaving the phone outside the bedroom, reducing notifications, keeping the phone away during meals, exercise, or daily family time, and setting consistent sleep hours often make a bigger difference than we imagine in the ability to focus and learn.
