It happened to an electrical engineer from New Hampshire, a medical researcher at Harvard University, and an aging auntie from Seattle—all of them permanent residents. They were each returning home to the United States from an ordinary trip abroad when they were pulled aside by immigration agents, subjected to a lengthy interrogation, and then taken into custody and transferred to a detention facility miles from home. Now they face an enormous, crushing bureaucracy that seizes on minor or long-forgotten infractions to hold them in indefinite detention.
This type of encounter is not new, but it is headline news in 2025. It also happens to be how my dystopian novel, The Dream Hotel, opens. Set in a future of total technological surveillance, the book follows an American archivist who is detained at Los Angeles International Airport because an algorithm has used her dreams and behavior to predict that she will commit a crime. One review called it a “Trump-Era Update” on Philip K. Dick’s The Minority Report. Another credited its “eerie sense of prescience.” When I was on tour for the book last month, someone asked if I’d known that the twice-impeached president and convicted felon would return to power.
I hadn’t. I started working on The Dream Hotel in 2014, during Barack Obama’s administration, and wrote the bulk of it during Joe Biden’s term in office. I had no idea Donald Trump would run for president in 2016, and after he lost in 2020, I didn’t expect he’d be reelected. I was thinking instead about the ever-more-invasive forms of data collection that Big Tech had unleashed. I wondered if, one day, one of their devices might target the subconscious. The novel takes U.S. systems of surveillance and incarceration that have been deployed at the southern border or on foreign soil—and applies them to Americans.
In writing about this potential future, I found inspiration in history. Surveillance has always been a part of the human experience, because it’s one of the mechanisms that enables power to be exercised and enforced in society. “No creature is hidden from His sight,” the Bible says, “but all are naked and exposed to the eyes of Him to whom we must give account.” The Quran warns, “God is all-knowing.”
Omniscience is not confined to the realm of religious belief. Authoritarian systems rest on the same idea: even if you’re hidden behind the walls of your own home, someone might find out that you said the wrong thing or read the wrong books or met with the wrong people, and punish you for your transgressions. During the Cold War, East Germany’s government employed a sprawling network of informants, equipping them with state-of-the-art technology to spy on the population. On a visit to Berlin’s Stasi Museum in 2023, I was struck by the range of everyday objects that could be used to conceal miniature cameras—a checkered tie, a jacket button, a watering can. The secret police even endeavored to create an archive of scents, making suspects handle yellow cloths that were then preserved in sealed glass jars. The Communist Party used this elaborate surveillance system to consolidate its power and crush political dissent for 40 years.
The United States has a long history of surveillance as well. The FBI famously spied on civil-rights activists, Black Panthers, feminists, Vietnam War protesters, and members of other leftist groups through programs such as COINTELPRO, which used wiretapping and mail interception to keep tabs on people it considered “subversive.” This gave the Bureau access to information it could then use to disrupt their activities or sow division among them. The program cast a wide net: Martin Luther King Jr., Malcolm X, and Angela Davis were surveilled, as were Bobby Seale, Tom Hayden, and Jane Fonda. Beyond these technical means of collection, the agency also relied on information gathered by informants and undercover officers.
For all its power to harm, though, surveillance can also take forms that almost everyone would agree are benign, or even beneficial. For example, medical doctors have a range of tools at their disposal to track patients’ heart rates, brain waves, or blood-glucose levels. The Federal Aviation Administration routinely conducts random drug and alcohol testing of its pilots and crews to ensure that they can fly safely. We watch young children when they play on the monkey bars, and keep a close eye on elders when they grow too frail or incapacitated to care for themselves.
Big Tech’s insidious hold on our lives comes from the fact that it combines both ends of this surveillance spectrum. Our devices deliver services that are highly protective (a text alert each time a financial transaction touches a bank account, for example) as well as potentially abusive (making our political speech or our geographic movements available to, say, a police officer or an immigration agent). Technology companies are careful to present this as a fair trade, convenience and connection in exchange for granular information, which makes it much harder for users to simply stop using their devices. In the early years of the internet, many people thought their data would be used only for targeted advertising.
By 2014, when I began working on my novel, the unholy alliance between Big Tech and the government was becoming apparent. Edward Snowden had revealed the existence of PRISM, a mass-surveillance program that the National Security Agency operated in partnership with tech companies such as Apple, Facebook, Microsoft, and Google. PRISM was authorized under Section 702 of the Foreign Intelligence Surveillance Act, and although officials maintained that its targets were foreigners, the communications of Americans were routinely collected as well. A friend of mine, an avowed liberal, shrugged it off; he had nothing to hide, he said, and he trusted that then-President Obama would do the right thing. But even if you conceded that Obama could be trusted with the data—which I didn’t—what would happen if this surveillance apparatus were run by someone else?
The Snowden disclosures led to a monthslong national debate about privacy, but that eventually died down, and the program continued to operate. Still, its potential for abuse stayed with me. I grew up in Morocco in the 1970s and ’80s, a period of state repression, kidnappings, and disappearances that came to be known as the Years of Lead, so I knew well what could happen when a government set its sights on an individual it found suspect or troublesome. A popular joke at the time went something like this: The CIA, the FBI, and the Moroccan police enter into a friendly contest. The secretary-general of the United Nations releases a rabbit into the woods and asks them to catch it. The FBI places informants in the forest and, when it can’t find the rabbit, concludes that it was never there. The CIA hits the forest with heavy artillery, then announces that the rabbit is dead. The Moroccan police go in and bring out a fox with two black eyes. “Okay, okay,” the fox says. “I am a rabbit.”
Growing up under state control made me hypersensitive, decades later, to the dangers of technological surveillance. Tech companies have access to an ever-growing and highly detailed archive of our lives: our texts and emails, our pictures, our habits and movements, our cultural tastes and political opinions. In The Dream Hotel, I wanted to explore a world where privacy as we know it has ceased to exist, and Big Tech’s alliance with the government has led to indefinite detention for pre-crime.
Since the novel came out, friends have been sending me stories in the news. Like a Guardian report about how the U.K. government commissioned the development of a homicide-prediction algorithm. Or a CNN piece about how the State Department considers the “expected beliefs, statements, or associations” of Mahmoud Khalil, the Columbia graduate and green-card holder currently being held in a Louisiana detention center, to be sufficient reason for his deportation. Or a Rolling Stone article about how the Trump administration might pursue denaturalizing American citizens and sending them to El Salvador. Then there is the New York Times story about how Elon Musk is leading efforts to create a giant government database that merges information from all existing federal records. Under this scheme, the personal, legal, financial, housing, educational, and employment information of every American would be centralized. (In my novel, this is called the OmniCloud.)
I thought I was writing about a time 20 or 30 years into the future. I didn’t foresee that in 2025, an unelected billionaire would have his underlings enter federal agencies over staffers’ objections and—according to an official whistleblower report—just copy the private data of millions of citizens. Nor did I imagine that the acting director of ICE would bluntly state his vision of a deportation force that operates “like [Amazon] Prime, but with human beings.”
But the point of a speculative novel isn’t to see what a writer got right or wrong about the future. A speculative novel isn’t even about the future, exactly, but about an alternative world in which our anxieties about the present moment are on full display. What if we faced a society-altering epidemic? (The Plague, Blindness.) What if the planet warmed? (Parable of the Sower.) What if we could clone ourselves? (Never Let Me Go.) What if some words and ideas were forbidden? (The Memory Police.) What if the government outlawed books? (Fahrenheit 451.)
We don’t put firefighters in charge of burning books—at least not yet—but Ray Bradbury gave us language to speak about the freedom to read and showed us how to notice threats to it. My hope is that readers will open themselves to the emotional experience of The Dream Hotel. And yes, maybe they will also think about the data they so easily and so frequently relinquish.