The best and worst developments in public health have always come from moments of crisis.
In 1937, when the Food and Drug Administration was still a tiny, toothless backwater unable to enforce even basic safety standards for the products under its purview, a new medication called Elixir Sulfanilamide killed 100 or so people in the space of a few weeks. It might have been a minor incident. Fake and faulty medicine was rampant at the time, and accidental poisonings were not uncommon. But many of the elixir victims were very young children, and agency officials wasted no time spinning the incident up into a national crisis. They had already spent decades arguing for tougher laws to hold drug makers accountable. Now, with an anxious and angry populace rallied to their side, they pressed their case — and prevailed. Within a year Congress had passed the Food, Drug and Cosmetic Act, a regulatory statute that changed the practice of medicine forever.
Some 40 years later, the Centers for Disease Control and Prevention faced a similarly pivotal moment when several soldiers at Fort Dix contracted swine flu, and one died. Anxious to head off a pandemic and eager to demonstrate their institutional might, agency officials launched a bold but hasty initiative to vaccinate every American against the new virus as quickly as possible. The public grew skeptical of the effort when the vaccines were linked to an extremely rare but serious side effect, Guillain-Barré syndrome. And when the threat of a deadly disease outbreak proved vastly overblown, they were outraged. Why foist an untested shot on an entire nation, for a virus that appeared to have sickened just a dozen or so people?
“It was supposed to be this great triumph,” says Joshua Sharfstein, a professor at the Johns Hopkins Bloomberg School of Public Health and author of “The Public Health Crisis Survival Guide.” “But it ended up seeding a generation of vaccine hesitancy instead.” The takeaway from these and similar parables is clear, Dr. Sharfstein says: Crisis can be a powerful catalyst for shaping policy and improving society. But just like any such tool, it can be misused as easily as used.
If that lesson isn’t new, it’s very much worth reviewing now. The United States is in what can only be described as an epoch of crisis. There is no quarter of American life that has not been claimed by the term, from the planet (climate) to the Republic (democracy, migration, housing) and the deepest chambers of the human heart (loneliness, despair). In the future, if we survive that long, historians will marvel at either our capacity to endure so much hardship at once or our ability to label so many disparate problems with the same graying word. In the meantime, officials and policymakers — and, yes, journalists — ought to consider how they employ this term and why, and whether it’s having the desired effect.
There’s no better place to start than with public health. Two centuries back, when infectious disease outbreaks still routinely devastated the nation’s most populous cities, the practices needed to protect the health of whole communities were an accepted (occasionally celebrated) part of the social contract. But public health has long since fallen victim to its own success: As plagues receded and life expectancy rose, support for the initiatives that made those achievements possible waned. And as clinical medicine improved, health itself became a matter for individuals, not for society at large.
Since the turn of the previous century, the national approach to public health has been governed by a cycle that experts refer to as neglect, panic, repeat. Elected officials ignore the nation’s public health apparatus — they starve it of funding and isolate it from the larger, more stable health care system — until a crisis or panic of some kind emerges. Then they flood that apparatus with resources, and a mad scramble begins not only to resolve the current crisis, but also to repair the many flagging structures most essential to that effort. Public health experts like to call this building the plane while flying the plane.
Then, when the crisis abates, the neglect resumes.
Nearly five years out from the beginning of Covid, the most substantial and straining turn of this cycle, its central and most damning paradox is clear: The nation’s public health apparatus is reliant on panic and outrage as a tool for addressing basic problems. But the nation itself is spent from so much panic and outrage.
It’s common to think of crises as events beyond the whims of human interpretation. They cannot be invoked, orchestrated or even denied. They are external and inevitable, almost by definition: They happen and we respond to them. Full stop.
But sounding an alarm (or sending up a bat signal, if you prefer that image) is always a choice. And it’s one that reflects nothing so much as our own values and fears. Elixir Sulfanilamide killed more Black men than it did white children, but it was only the white children who made it a crisis. The crack epidemic was a crisis of criminal justice and the opioid epidemic one of public health, largely because of which communities were afflicted by each.
Even a deadly disease is only a crisis when we treat it like one. If that’s difficult to believe, keep in mind that heart disease has killed nearly 100 million people in the time it took Covid to claim seven million or so.
In public health, at least, there is no objective formula for determining when a problem becomes a crisis. Emergencies and epidemics are governed by formal declarations and precise definitions. Crises are a much fuzzier beast. It’s up to us (officials, journalists, individuals) to decide when or whether that threshold has been crossed and, if so, what to do about it. And those decisions have always been fraught. Employing what Dr. Sharfstein refers to as “the language and urgency of crisis” can draw attention to neglected problems and clear the path to better health policies that help alleviate suffering. But it can also just as easily make matters worse.
History is chock-full of examples of both. It was in moments of crisis that Americans passed laws to protect our air and water, enacted policies that repaired the ozone layer and established programs to monitor infectious disease threats around the world. Crisis gave us the F.D.A., the C.D.C. and the Environmental Protection Agency. But it also led to racist and ineffective quarantines. It gave rise to vaccine hesitancy. It helped enshrine a war against obese people (and another against those who struggle with addiction) that has yet to fully abate.
Our current roster of apparent public health crises is both sprawling and haphazard. Mass shootings make the list, but everyday gun violence seldom does. Adolescent angst and parental well-being have been elevated to five-alarm fires, but serious mental illness and the enduring lack of resources to treat it have not been. Obesity is a crisis, but heart disease, which is a far greater cause of preventable death, seems not to be. Last year, the United States surgeon general described loneliness as a crisis so dire, it threatens to “rip our country apart.” But neither he nor any other federal health official has raised comparable alarm about the two million or so people across the country who don’t have reliable access to clean drinking water, or the 47 million who regularly go to bed hungry.
All of these issues deserve attention and resources, to be sure. But when it comes to discussing their relative urgency and import, we need either a new way or a new word.
There is no shortage of ways in which a crisis can lead a society astray. Obvious solutions are overlooked in a panic, or wiser courses abandoned, or a cascade of unintended consequences unleashed.
For just one example of how this plays out, consider the maternal mortality crisis. Doctors and health officials have spent years sounding the alarm over what looked like a persistent rise in maternal deaths that began in the early 2000s. But as a recent study and a seminal article in The Atlantic make clear, that uptick was almost entirely an artifact of changes in the way such cases are defined and tracked. Maternal mortality is still a problem, but it is not growing in the way Americans have been led to believe.
This misapprehension may seem beside the point. The United States still has one of the highest maternal mortality rates in the developed world, regardless of whether the problem is growing. And that burden is still being shouldered disproportionately by the nation’s Black mothers, who die at nearly three times the rate of their white counterparts. But a “steady rise” is a fundamentally different problem from an “unequal decline” or even “stubborn persistence,” and it stands to reason that valuable resources were wasted as policymakers chased wild geese and caught red herrings.
It also stands to reason that the alarm-sounding itself has saddled prospective Black mothers, especially, with undue terror. The racial gap in maternal deaths is significant and shameful, and merits every effort to close it. But in 2022, less than one-half of one-tenth of 1 percent of Black women who gave birth or were expecting (253 out of 511,439) died from maternal causes. That means that for all our health care system’s many failings, having a baby in the United States is still exceptionally safe, even for women of color.
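To spell out that arithmetic (a quick check using only the figures above; one-half of one-tenth of 1 percent is 0.05 percent):

\[
\frac{253}{511{,}439} \approx 0.000495 = 0.0495\% \;<\; 0.05\%
\]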
The extent of such fallout remains to be seen, but much of it could have been avoided, had the alarm-sounders only looked more carefully at the data (which at least some epidemiologists were questioning from the outset) and then tempered their message accordingly.
Maternal mortality is not the only such case study. Back in 2016, when health officials reported that life expectancy was declining in the United States for the first time in decades, some economists attributed the decline to so-called deaths of despair, a new category of horror that included suicides, drug overdoses and alcohol-related liver disease, all of which were said to be rising among white men who were not college educated. The analysis was debunked fairly quickly by other economists (and more recently by journalists), who pointed out a multitude of flaws, including that the drop in life expectancy and the rise in deaths of despair did not actually overlap (they occurred at different times and in different groups of people).
But the idea — a plague of despair so great that it was bringing the national life expectancy down — was too compelling to dispel with facts. Instead, despair became the go-to explanation for everything from shorter life spans to the rise of Donald Trump.
It’s clear now that while no one force or factor can account for the country’s shifting politics, declines in life expectancy are better explained by heart disease than by any vast plague of despair. As with maternal mortality, the difference matters: Policymakers and health officials might not know how to tackle despair or whatever broad economic forces they believe caused it, but heart disease has several straightforward solutions.
The obesity crisis has also brought its share of unintended consequences. Alarm bells have almost certainly nudged more people to eat healthier foods. They also helped spur the development of effective anti-obesity medications. But they have not touched off any meaningful effort to repair our food system, which most experts agree is the root cause of expanding waistlines. “Obesity did not reach epidemic proportions because of changes in human nature or human willpower,” says Tom Frieden, who served as C.D.C. director under the Obama administration and is now president of the public health nonprofit Resolve to Save Lives. “What changed is that our environment became far more conducive to weight gain.”
What crisis vibes have managed to accomplish is to normalize fat-shaming, especially among doctors. Shame is a deeply ineffective way to resolve any health crisis, but it has proved especially counterproductive and cruel when it comes to weight loss. As decades of data make clear, dieting — the only strategy that most people have to accomplish this vaunted goal — rarely works. And as any thoughtful doctor will tell you, weight is an unreliable marker of health. Depending on which studies you look at, some 30 percent to 60 percent of those who qualify as obese are in fact metabolically healthy.
But thanks to a sustained stigma offensive, people who struggle with their weight are now facing a different health challenge: They are less likely to be offered crucial health screenings, more likely to have legitimate health concerns dismissed by doctors and increasingly likely to avoid the health care system altogether. If the goal was to improve the health of the obese, the strategy of panic-then-blame has done worse than merely fail: It has cost us crucial ground.
Nowhere is the penchant for crisis on fuller or more puzzling display than in the office of Vivek Murthy, the United States surgeon general under both the Obama and Biden administrations. When Dr. Murthy used his platform to sound the alarm about a youth mental health crisis in 2021, the nation took notice. When he used similar terminology to decry a plague of loneliness in 2023, some observers raised their eyebrows. And when he did it again in 2024 to express concern about parental well-being, many critics rolled their eyes and yawned.
Some warned that these goofy, gimmicky initiatives would undermine Dr. Murthy’s more credible efforts (including a crucial one to tackle gun violence). Others argued that the trends in question were neither new nor growing, nor necessarily a cause for panic.
But the biggest problem, according to Nat Kendall-Taylor, the chief executive of the FrameWorks Institute, a communications think tank, is how all the noise wears people out. “There’s this expectation that, ‘If only people knew how bad the problem was, they would trip over themselves, running to support my initiative,’” Mr. Kendall-Taylor says. “And the data are really clear that that assumption is incorrect.”
The reasons have less to do with the wisdom or folly of any given crisis than with the way the human brain works. “The way we frame a message impacts the way people hear and react to it,” Mr. Kendall-Taylor says. The crux of this exchange is what social scientists refer to as mind-sets, discrete sets of assumptions and beliefs that guide our thinking and decision-making: Different messages conjure different mind-sets, which in turn lead people to make different choices. For example, Mr. Kendall-Taylor and his colleagues have found that messages built around helping “vulnerable” groups often activate a mind-set known as otherism, which can make people less likely to support interventions that cost money. But framing the same issue in terms of social progress and future prosperity triggers a sense of collectivism, which can increase support for the same interventions.
Crisis framing tends to activate a mind-set known as fatalism (a sense that the world itself is beyond repair), which in turn makes people apathetic and resistant to change. “We tire very quickly of being told that everything is on fire,” Mr. Kendall-Taylor says. A far better strategy for instilling urgency or inspiring action is to show people that real solutions lie at the ready — that change is not only desirable but also eminently possible.
Dr. Sharfstein, who teaches a course on the intricacies of public health crisis response, agrees that drumming up urgency without offering real solutions is fairly pointless. But the overuse of “crisis” is only half the problem, he says. For every official who cries crisis too much, there are several who do their best to avoid using the term even when it’s clearly warranted. “There’s a lot of risk to owning a crisis, including that you will be blamed if it isn’t resolved,” he says.
What Dr. Sharfstein wants his students to understand is that in public health, at least, “crisis” is not something to avoid or to employ with abandon. It is a tool, and it works better in some situations than in others. “It’s not about the size of the problem you’re trying to solve, or even about some strict cap on the number of times you can play the crisis card,” Dr. Sharfstein says. “It’s about whether or not your situation is amenable to this particular tool. Do you have a story to tell that will inspire urgency? A workable solution? The data to make your case? If you don’t, hitting the panic button isn’t likely to help much.”
However tall that order might sound, there is no shortage of examples of what it looks like when done right.
In the early 1970s, when New Yorkers were still dumping hundreds of millions of gallons of raw sewage into the Hudson River every single day, Congress united to override a presidential veto and pass the Clean Water Act, which saved that river and countless others. In the 1980s, when scientists detected a hole in the ozone layer, the world rallied to phase out the use of chlorofluorocarbons, which were being used in everything from refrigeration to packaging material and hair spray.
And in the 1990s, when tobacco companies were still considered untouchable in Washington and tobacco products were still regularly sold to children, the F.D.A. commissioner David Kessler embarked on an epic quest to classify nicotine as a drug. Neither smoking nor any of its related health problems were new. But by building a painstaking and highly public case against cigarette makers, Dr. Kessler managed to ignite a fresh sense of outrage, which led in turn to much stronger tobacco regulations.
These problems — smoking, water, air — were as outsize as loneliness. But the efforts to fix them were concerted. The messaging was clear, the solutions were obvious and the public urgency, once awakened, was not wasted.
There is a frustrating duality to the effects of crisis-mongering on the collective psyche. It may dull our sense of what’s possible, but it also primes the populace with anxiety, and a perpetually anxious society is a vulnerable one. There is no telling who we might surrender our judgment to or what portion of the social contract we might agree to scrap, if only to silence the alarm bells. And if we’re careless in our desperation, or unlucky, those compromises could prove more fatal than any initiating crisis by itself.
It has happened before.
As the historian Eliah Bures notes in a 2020 Foreign Affairs essay, when suicide rates abruptly rose in the Weimar Republic, every societal faction seized on the uptick to make its own case about what was wrong with society and how best to fix it. “For the Nazis, suicides highlighted how ordinary Germans suffered from the nation’s humiliation under the punitive Treaty of Versailles,” Mr. Bures wrote. “Communists invoked suicides as proof of capitalism’s dehumanizing impact on workers. According to liberals and Social Democrats, suicides attested to the deleterious effect of an authoritarian school system. And traditional conservatives appealed to suicides as a sign of the breakdown of religion and family life.”
For all their obvious differences, Mr. Bures writes, those arguments shared something fundamental: a sense that the present was deeply fraught and that the faster we ushered in the future, the sooner all our problems would be solved. Historians would later note that the increase in Weimar suicides was actually pretty small, but as Mr. Bures implies, the crisis itself never really mattered. “What killed the Weimar Republic, Germany’s first liberal democracy, was not an objective predicament,” he writes, “but the fear and desperation of runaway crisis-consciousness.”
The surest way to avoid a similar fate is to strive for a system that is less crisis-dependent overall and more thoughtful and consistent about when and how to hit the panic button. Those are big, broad goals, but at least some of the things it will take to accomplish them are clear.
It starts with identifying the nation’s top public health priorities and then providing steady, consistent funding to address them. One idea with bipartisan support is to create a separate “health defense operation” budget for a core set of infectious disease priorities, so that state and federal agencies don’t have to wait for a crisis to take action. “Instead of providing $5 or $10 or $100 billion every five to 10 years,” Dr. Frieden of Resolve to Save Lives says, “why not provide like $2 billion every year, for a set of well defined priorities that Congress can decide on?”
Further crisis-proofing will require officials to start distinguishing between what the former C.D.C. director William Foege has called the immediate and the important. That means paying as much attention to the roots of a given problem as to the hair-on-fire consequences. Want to conquer lung cancer? Focus on curbing cigarette consumption. Alcohol abuse? Look at the policies that govern alcohol sales. Chronic health conditions? Confront the original sin of employer-based health insurance and find a way to make health care itself universal.
Reforms that incentivize prevention, and ones that curb the influence of special interests, would also help. For just one glaring example of why and how, consider heart disease: It’s the leading cause of death in the United States, and conquering it would save more lives and money than any number of promising health interventions or crisis clarion calls combined. Doctors have the tools, but despite decades of high-profile initiatives and steady hand-wringing, fewer than half of people with high blood pressure in the United States are treated successfully.
The problem, according to Dr. Frieden, is that hospitals still make a lot more money treating the worst outcomes of heart disease (strokes and heart attacks) than primary care doctors do preventing them. “No doctor wants their patient to have a heart attack or stroke,” Dr. Frieden says. “But the economic incentives determine that that is what will happen most of the time.” Efforts to address the issue through food policy (by reducing the salt or sugar content of certain foods, for example) have been consistently thwarted by industry.
Upending these dynamics is the key to improving Americans’ health.
In the meantime, those tasked with setting health priorities or fighting for resources to address health problems would do well to remember that it can be dangerous to seed so much anxiety; that when everything is a crisis, nothing can be; and that while crisis itself can be a powerful tool, it works best when used wisely.
Those of us with far less control over policy priorities or health agendas ought to remember the same. Panic is understandable. It can seem like the only bulwark against indifference. But there are many dark angels at our door now. Resisting them will require a better strategy than that.