Today we take for granted that diet and exercise are vitally important for our health and well-being. But we didn’t always think this way. Much of this awareness emerged in a remarkably short time during the middle of the last century.
In 1955, President Dwight Eisenhower suffered a heart attack after playing golf in Denver. This event shocked the nation. The president was just 64 years old and projected American strength and vitality. The surgeon general at the time said that hearing the news about the heart attack was like learning about the bombing of Pearl Harbor.
Instead of retreating into secrecy, the White House flew in Dr. Paul Dudley White, a leading cardiologist who helped found the American Heart Association. He set a standard for transparency. When he spoke to the press, he went beyond explaining the president’s condition and sought to educate the public about cardiac events more generally.
“Heart attacks became less mysterious and less frightening to millions of Americans that day,” explains a New England Journal of Medicine article, “and White gave them the message that they could take steps to reduce their risk.” The idea that diet played a large role in mortality soon entered the national consciousness.
A little more than a decade later, Dr. Kenneth Cooper, a military doctor who conducted fitness research for NASA, published a book titled “Aerobics.” He promoted a novel argument: Cardiovascular exercise was critical for health. In an era when people increasingly had sedentary jobs and lived a car-based lifestyle in the suburbs, he emphasized the need to specifically put aside time to exercise as a key component of longevity.
This was a radical idea in a culture in which voluntary exercise was associated primarily with the Army or sports. “Aerobics” became a best seller, and millions of people began exercising. According to Dr. Cooper, when his book was first published, less than 24 percent of the adult population engaged in regular physical activity, and there were fewer than 100,000 joggers. Within 16 years, close to 60 percent of the population exercised, including 34 million joggers.
The larger point is that transformations in understanding can unfold quickly. Just decades after Eisenhower and Dr. Cooper, we got the food pyramid, the term “low fat,” the running craze and Jane Fonda videos. Americans would never think about food and exercise the same way again.
In our current moment we face a new crisis, one that affects our minds more than our bodies: the negative impact of digital technology on our ability to think.
Is it time for a new revolution?
When I published my book “Deep Work” 10 years ago, I argued that email and instant messages were degrading our ability to concentrate on hard mental tasks. I recommended putting aside long stretches of time for uninterrupted thinking and treating this cognitive activity like a skill that you can improve through practice. The term “deep work” quickly entered the vernacular, and I started to hear people and companies use it without even realizing its source.
But the problems I focused on in “Deep Work,” and in my writing since, have been getting steadily worse. In 2016 my main concern was helping people find enough free time for deep work. Today I think we’re rapidly losing the ability to think deeply at all, regardless of how much space we can find in our schedules for these efforts.
The data backs up this claim. Research from Gloria Mark, a professor of informatics at the University of California, Irvine, indicates that our attention spans are about one-third as long as they were in 2004, with the biggest drops happening around 2012. Long-running surveys reveal that the share of U.S. adults who struggle with basic reading or math has risen markedly over the past decade, while the percentage of 18-year-olds who report difficulty thinking and concentrating jumped in the same period. A Financial Times article about these findings posed a shocking but relevant question: “Have humans passed peak brain power?”
Many of these declines in cognitive skills became notable starting in the mid-2010s, exactly the period when smartphones became ubiquitous and the digital attention economy exploded in size. A growing body of research suggests that this timing is no coincidence. A meta-analysis released last fall showed that consuming short-form video content, as delivered by apps like TikTok and Instagram, is associated with poorer cognition and reduced attention, and a clever experiment from 2023 found that the mere presence of participants’ smartphones in a room significantly reduced their ability to concentrate.
The growth of A.I. has brought new cognitive concerns. A study from January, based on surveys and interviews with more than 600 participants, revealed a “significant negative correlation between frequent A.I. tool usage and critical thinking abilities.” Another recent study, which tracked the brain activity of research subjects who were writing with the help of large language models, found that “brain connectivity systematically scaled down with the amount of external support.”
The loss of our ability to think is a big deal. Close to 40 percent of the U.S. gross domestic product comes from so-called knowledge and technology-intensive industries, from aerospace manufacturing to software development to financial and information services. Companies in these fields alchemize advanced human thought into revenue; as we weaken our brains, we also threaten to weaken our economy. It is notable that productivity growth in the private business sector stagnated during the 2010s, when technology became measurably more distracting.
A diminished ability to use our brains also has concerning personal effects. Thinking is what lets us make sense of information in a complicated world. As president, Abraham Lincoln used to regularly retreat to his cottage, on the grounds of the Soldiers’ Home in the heights above Washington, to find the solitude needed to think intensively about the decisions facing him as commander in chief. A contemporaneous letter from a Treasury employee visiting Lincoln at the cottage during these years describes finding the president “reposed in a broad chair, one leg hanging over its arm. He seemed to be in deep thought.”
Thinking is also an engine for making sense of our lives and cultivating our moral imaginations. In 1956, as the Montgomery bus boycott careened into national prominence, Martin Luther King Jr. clarified his life’s purpose through a long session of quiet reflection one memorable night at his kitchen table, when he remembers his thoughts finally forming into a clear directive: “Martin Luther, stand up for righteousness. Stand up for justice. Stand up for truth.”
In an era when technologies relentlessly disrupt our lives, it can seem that this cognition crisis is a fait accompli — a side effect of innovations that cannot be stopped. But do we really have to accept this steady loss of our thinking ability as inevitable? In a short time, we transformed the way we thought about health. I’ve come to believe that a similarly rapid revolution is possible in how we respond to our diminishing ability to think.
What would such a revolution look like? In the world of physical health, we now know we should largely avoid ultraprocessed snacks like Doritos and Oreos, which are Frankenfoods made by reconstituting stock ingredients like corn and soy with hyperpalatable ratios of salt, sugar and fat. Much of the digital content that ensnares our attention in the current moment is also ultraprocessed, in that it’s the result of vast databases of user-generated content that are sifted, broken down and recombined by algorithms into personalized streams designed to be irresistible. What is a TikTok video if not a digital Dorito?
We should consider taking as strong a stance against ultraprocessed content as we already do against ultraprocessed food. Which is to say: Most people should avoid these diversions most of the time. Just as you probably don’t eat Twinkies as a regular snack or still believe that Pop-Tarts provide a balanced breakfast, stop consuming ultraprocessed content. Don’t use TikTok. Don’t use Instagram. Don’t use X. Their sugar-high benefits aren’t worth the costs.
There was a time when such a suggestion would have been considered eccentric and unworkable. (I certainly received my share of pushback when I first suggested that social media wasn’t actually as important as people claimed.) But I think that, just as our understanding of diet changed, we’re ready to accept that the metaphorical nutritional value of scrolling through outrage-tinged posts and short-form videos is minimal.
Governments can assist efforts to improve digital nutrition. In a move reminiscent of the Food and Drug Administration’s ban on trans fats, Australia recently enacted legislation banning social media use for kids under the age of 16. In both cases, regulators looked at the evidence and concluded that the potential harms (whether it’s heart attack risk or damaged mental health) far outweighed the benefits.
The United States should follow Australia’s lead in this regard. Will some kids find their way around any safeguards that are put in place? Of course; such evasion is already happening in Australia. But the larger message sent by such laws is important. They reframe social media as something that should be closely monitored, similar to such age-gated vices as alcohol and tobacco — substances we’ve learned to approach with caution.
To extend the physical health analogy, let’s consider exercise. The cognitive equivalent of aerobic activity is contemplation — the intentional focusing of your mind’s eye on a singular topic, with the goal of increased understanding. Just as the sedentary lifestyles that emerged in the mid-20th century degraded our bodies, our current lack of contemplation is degrading our brains.
How do we get this cardio for our ailing brains? A good candidate is reading. Making sense of written text exercises our minds in important ways. We develop what the cognitive neuroscientist Maryanne Wolf calls “deep reading processes” that rewire and retrain neuronal regions in ways that increase the complexity and nuance of what we’re able to understand. “Deep reading is our species’ bridge to insight and novel thought,” she writes. Perhaps consuming a few dozen book pages a day should become the new 10,000 daily steps — a basic foundation of activity to maintain cognitive fitness.
Another way to exercise our brains is to reject the constant companion model of phone use, in which we keep smartphones on us at all times. This places us in an untenable mental environment in which bundles of neurons in our short-term motivational systems, trained through experience to expect a quick reward from looking at our phones, are constantly firing, creating an insistent urge to pick up the device. This makes any act of sustained contemplation a battle of willpower — a battle we all too often lose. In this way, having constant access to our phones becomes a serious impediment to cognitive exercise.
One solution to this constant companion problem: Spend more time with your phone out of easy reach. If it’s not nearby, it won’t be as likely to trigger your motivational neurons, helping clear your brain to focus on other activities with less distraction. Let’s boil this down to a simple rule: When you’re at home, keep your phone charging in your kitchen instead of in your pocket. If you need to check your messages or look something up, do it in the kitchen. If you’re waiting for a call, turn on the phone’s ringer. This strategy allows you to participate in activities such as eating meals, watching a shared show or talking with your family, without the distraction of constantly wanting to glance at a secondary screen.
Our institutions have a role to play here as well, as rules and regulations that reduce distraction in group settings can help support the strengthening of cognitive abilities. In the wake of the success of the N.Y.U. psychologist Jonathan Haidt’s 2024 book, “The Anxious Generation,” many school districts around the country began banning smartphones from classrooms. These efforts have proved to be exceptionally fruitful. A 2025 working paper from the National Bureau of Economic Research found that school phone bans were followed by “significant improvements” in student test scores; similarly, three-quarters of the 317 high schools surveyed by a Dutch research team reported that phone bans improved focus, and two-thirds reported that they improved the “social climate” in their school.
Such interventions can be expanded beyond the classroom. Before the pandemic, a business media company named Skift experimented with a ban on bringing laptops and phones to internal meetings. In an interview with CNN, Rafat Ali, the company’s chief executive, said the rule increased communication among his employees. “If you don’t have rules about laptops, people hide behind them,” he said. Such reforms might have been hard to sustain during the Covid years, but now is a good time to start exploring them again. In August the brand strategist Adam Hanft wrote an opinion essay suggesting that employees should put their smartphones in a lockbox before entering a meeting room. “Developing minds need focus,” he wrote, citing the success of school phone bans, “but so do supposedly developed ones.”
In an office setting, the incessant demands of digital inboxes and instant messages present an even bigger obstacle to fully using our brains. Microsoft’s 2025 Work Trend Index Report found that the office workers it studied were interrupted once every two minutes, on average. In 2021, I published a book titled “A World Without Email,” which argued we should aggressively transform collaboration strategies so we’re no longer dependent on a steady stream of back-and-forth messaging to accomplish work. (Looking at you, Slack.) The title of my book struck some as far-fetched — I used to joke that booksellers were shelving it in the fantasy section — but I was being serious. If we value our brains, we have to be ready to pursue profound changes to workplace culture.
Generative A.I. introduces its own set of challenges, especially when the technology intersects with our professional lives. In September a splashy article in Harvard Business Review reported on the rapid rise of “workslop,” which the authors defined as “A.I.-generated work content that masquerades as good work, but lacks the substance to meaningfully advance a given task.” The result is a contradiction: “While workers are largely following mandates to embrace the technology, few are seeing it create real value.” A recent study conducted by researchers at the Boston Consulting Group found that offloading difficult tasks to A.I. led to increased mental exhaustion — a state they called “brain fry” — because of the constant context switching required to monitor and manage the A.I.’s behavior.
Why would we use A.I. in ways that ultimately make work more draining? My suspicion is that we often deploy these tools not because they make us better at our jobs but because they help us to avoid moments of sustained concentration. It’s hard to confront a blank page, so why not coax a mediocre draft of that planning document out of a chatbot? Gathering and analyzing sources for a marketing report is demanding, so why not release a swarm of A.I. agents to tackle the task instead? The problem here is self-reinforcing. Existing brain drains like social media and email reduced our ability to think before generative A.I. arrived, making us more willing to use this new tool to avoid mentally demanding tasks once we had access to it. At the same time, the more we use A.I. in this manner, the more our cognitive fitness will continue to degrade.
Both managers and employees need to map out when it’s best to use A.I. If the technology creates significant time savings, such as when a user prompts an L.L.M. to sift through a large collection of documents or asks an A.I.-powered agent to fix formatting errors in a data set, then those are obvious wins. Indeed, the authors of the “brain fry” article found that using these tools to automate “routine or repetitive” tasks decreased burnout. But any use of A.I. that mainly serves to make core business tasks cognitively less demanding should be treated with caution. Here’s a simple rule that reinforces this idea: Your writing should be your own. The strain required to craft a clear memo or report is the mental equivalent of a gym workout by an athlete; it’s not an annoyance to be eliminated but a key element of your craft.
The problems I describe here are only going to get worse. To stave off disaster, we need a full revolution in defense of thinking, launched against the digital forces seeking to degrade it. No more shrugging (“What can you do? Kids these days just love their devices.”) or halfhearted experiments with minor tips (“Turn off notifications”) or passive acquiescence to the latest tools (“If I don’t embrace A.I., I’ll be replaced by someone who does”).
The key to this transformation is action. In the half-century that followed Eisenhower’s heart attack, age-adjusted death rates from cardiovascular disease fell by 60 percent, creating what one academic study called “one of the most important public health achievements of the 20th century.” Meanwhile, exercising has become so common as to become unremarkable. There are now more than 55,000 gyms and fitness studios in the United States alone — a reality that would have been unthinkable during the sedentary age before the publication of “Aerobics.” But Dr. White’s briefings and Dr. Cooper’s book were not enough on their own to create this transformation. It was the collective action in the wake of these events that ultimately mattered more.
In this period, the government got more heavily involved in studying and communicating new guidelines about diet and exercise, as did major nonprofit organizations, like Dr. White’s American Heart Association. Individuals and communities started experimenting as well, leading, for example, to an explosion in varieties of recreational exercise and best-selling books, like Michael Pollan’s “The Omnivore’s Dilemma,” which opened people’s eyes to more grounded relationships with food. Individual interest, in turn, led to business responses, such as the rapid expansion of fitness club and gym options and countless new health food brands. We still have a long way to go to fully address our country’s health problems, but by working together, we’ve made a large amount of progress.
I think we’re finally ready for a similar burst of self-reinforcing action in defense of our cognitive fitness. What I’ve laid out here is not a complete program to reclaim our heritage as contemplative beings but instead a useful starting point. My intention is to spur a shift in understanding that can build into a larger revolution. I’m done ceding my brain — the core of all that makes me who I am — to the financial interests of a small number of technology billionaires or the shortsighted conveniences of hyperactive communication styles. It’s time to move past fretting about our slide into the cognitive shallows and decide to actually do something about it.
We did it before. We can do it again.
Cal Newport is a professor of computer science at Georgetown University and the author of “Slow Productivity,” “Digital Minimalism,” “A World Without Email” and “Deep Work.” He’s also the host of the “Deep Questions” podcast.