DNYUZ
Can We See Our Future in China’s Cameras?

June 23, 2025

I heard some surprising refrains on my recent travels through China. “Leave your bags here,” a Chinese acquaintance or tour guide would suggest when I ducked off the streets into a public bathroom. “Don’t worry,” they’d shrug when I temporarily lost sight of my young son in the crowds.

The explanation always followed: “Nobody will do anything,” they’d say knowingly. Or, “There’s no crime.” And then, always, “There are so many cameras!”

I couldn’t have imagined such blasé faith in public safety back when I last lived in China, in 2013, but on this visit it was borne out: cameras gawked from poles, flashed as we drove through intersections, lingered on faces as we passed through stations or shops. And that was just the most obvious edge of the ubiquitous, multilayered tracking that has come to define life in China. I came away troubled by my time in some of the world’s most-surveilled places — not on China’s account, but because I felt that I’d gotten a taste of our own American future. Wasn’t this, after all, the logical endpoint of an evolution already underway in America?

There was a crash course on the invasive reality of a functionally cash-free society: credit cards refused and verge-of-extinct paper bills spurned. I had to do the thing I’d hoped to avoid: link a credit card to WeChat. That behemoth Chinese “super app” offers everything from banking to municipal services to social media to shopping, and is required to share data with the Chinese authorities. (Elon Musk, by the way, reportedly wants to turn his own app, X, into an invasive offering modeled after WeChat.) Having resigned myself to all-virtual payments, I knew I was corralled like everyone else into unbroken visibility, unable to spend a single yuan or wander down a forgotten side street without being tracked and recorded.

Crisscrossing China as a chaperone on my son’s school trip, I felt that a country I’d fondly remembered as a little rough-and-tumble had gotten calmer and cleaner. A part of me hated to see it. In my own mind, I couldn’t separate the safe, tidy streets from the repressive system of political control that underpins all those helpful cameras.

The Chinese Communist Party famously uses surveillance to crush dissent and, increasingly, is applying predictive algorithms to get ahead of both crimes and protest. People who screen as potential political agitators, for example, can be prevented from stepping onto trains bound for Beijing. During the Covid pandemic, Chinese health authorities used algorithmic contact tracing and QR codes to block people suspected of viral exposure from entering public spaces. Those draconian health initiatives helped to mainstream invasive surveillance and increase biometric data collection.

It would be comforting to think that China has created a singular dystopia, utterly removed from our American reality. But we are not as different as we might like to think.

Thankfully, our political architecture lacks a unified power structure akin to the C.C.P.’s. Americans — who tend to value individual liberties over collective well-being — have deeply embedded rights that, at least theoretically, protect us from such abuses.

But if Americans have learned one thing recently, it’s that rights we thought of as inalienable can prove perishable. We still think about surveillance as something that protects us (data-grabbing door cameras and security systems), that makes life easier (smart home systems, mapping tools, useful apps) or, at worst, that figures out how to sell us things we like (cookies, social media). Many Americans are oblivious to the porous boundary between private companies that collect our intimate details and the arms of government buying it up. As the Trump administration hardens into increasingly authoritarian methods of control, China should be a reminder that promises of safety and convenience can camouflage the machinery of political abuse.

***

As my face was getting scanned all over China, Elon Musk’s minions with the so-called Department of Government Efficiency were ransacking federal agencies to seize Americans’ data and sensitive information. Legal experts maintain that accessing this data is illegal under federal privacy laws, which broadly forbid government agencies from disclosing our personal information to anyone, including other parts of the government, without our written consent. But, in the event, neither the law nor our lawmakers protected us.

Mr. Musk’s team moved to access Social Security Administration data containing medical and mental health records, bank and credit card information, and birth and marriage certificates. This month, the Supreme Court temporarily allowed DOGE to access sensitive Social Security records. That means that DOGE staff, under the vague slogan of eliminating wasteful spending, can peruse files containing the most jealously guarded details of millions of American lives — everything from salary to addiction and psychiatric health records.

“What is this going to be used for?” asked Daniel Solove, a George Washington University law professor and the author of several books on privacy and technology. “What are the protections? Where does he have it? What will be done with it? What could be done with it in the future?

“None of these questions are answered,” he said. “There’s no transparency, no accountability, no limitations.”

Meanwhile, the data analysis and technology firm Palantir, which was co-founded by Alex Karp and Peter Thiel (another Trump acolyte), has already received more than $113 million from the federal government since President Trump took office again. Officials have told The Times that the Trump administration is using Palantir technology to help consolidate data on Americans held by disparate federal agencies so that it could potentially create a centralized dossier. In April, Immigration and Customs Enforcement announced a $30 million contract with Palantir to create a system that will give ICE “near real time visibility” of people self-deporting, and prioritize whom to deport next.

Mr. Trump’s second term has been marked by incessant talk of investing in A.I., winning at A.I., getting ready for A.I., while tech executives lavish money on Mr. Trump and jockey for favor. The president has made it clear that he doesn’t want any pesky state governments getting in the way of this sensitive, emerging technology.

All state laws regulating A.I. — dozens of them — would be nullified, and states would be banned from creating new A.I. regulations for the next decade under a measure embedded among the tax cuts and social spending cuts that the House passed in Mr. Trump’s “big, beautiful bill.” Senate Republicans have proposed replacing the ban with a measure blocking federal funding for broadband projects if states regulate A.I. It’s not paranoid to ask what Mr. Trump, tech executives and their political allies have in mind.

The government’s enthusiasm for this emerging technology is disquieting. A.I. could help to supersize the surveillance state, offering the potential to quickly synthesize and draw inferences from massive quantities of data.

“The really powerful thing is when personal data get integrated,” said Maya Wang, associate China director at Human Rights Watch. “Not only am I me, but I like these things, and I’m related to so-and-so, and my friends are like this, and I like to go to these events regularly on Wednesdays at 6:30. It’s knowing relationships, movements and also any irregularities.”

Ms. Wang mentioned Police Cloud, an ambitious Chinese public safety project that uses all manner of collected data to find hidden relationships between events and people; to spy on those considered dangerous (petitioners, dissidents, Uyghurs, people with “extreme thoughts,” according to a document reviewed by Human Rights Watch); and to combine real-time monitoring with predictions for what may be about to happen. Predictive software has been adopted by local authorities around China: A Tianjin data project designed to head off protests analyzes who is most likely to file complaints; software in the city of Nanning can warn authorities if “more than three key people” checked into a hotel.

It’s not that our government is using the surveillance infrastructure in the same manner as China. It’s that, as far as the technology goes, it could.

“People used to say, in a xenophobic way, ‘We don’t want to end up like China,’” said Caitlin Seeley George, managing director at Fight for the Future, an organization advocating rights in the digital age. “The truth is, it may be a little less visible to us, it may look a little different, but the systems are in place here to support that kind of data sharing.”

The government has also been using privately collected data to crack down on ordinary Americans — mostly, so far, in the realm of immigration enforcement, but not exclusively.

In 2023, for example, a Nebraska teenager and her mother were imprisoned after the police obtained their private Facebook messages discussing the use of abortion pills to end the teenager’s pregnancy.

In 2018 The Verge reported that Palantir (yes, Palantir again) had for years been secretly collaborating with New Orleans police to experiment with using troves of previously siloed data to identify people who were deemed more likely to commit crimes.

Since Mr. Musk started his big DOGE data grab, a spate of lawsuits has been filed by civil liberties and technology watchdogs, labor unions and state governments seeking to stop the seizures and get more information about what’s already been handed over.

The government has offered little explanation for what it’s doing with our data but, in April, Wired reported that DOGE has already started to integrate immigration data with Social Security and tax data.

This is particularly nefarious given the recent abuse of immigration enforcement. Students here on valid visas were overtly targeted because of their political speech — specifically, for participating in legal demonstrations for Palestinian rights. State Department officials have described plans to use A.I. surveillance to comb social media posts to identify students for visa revocation. (It’s worth noting that invasive government perusal of social media is a bipartisan tendency — under President Joe Biden, for example, the Department of Homeland Security combed social media looking for discussions of abortion after Roe v. Wade was overturned.)

Surveillance and tech specialists warn: This could be just the beginning.

“Once you consolidate data in a massive way like this, where your tax records are living next to your federal contracting records and your political donation records, the opportunity for abuse is significant,” said Cody Venzke, a senior policy counsel at the American Civil Liberties Union, which is among the organizations suing the federal government for information about the DOGE data breach.

China manipulates data to create social credit scores that identify untrustworthy businesses or that allow overzealous officials to blacklist citizens for perceived vices.

Many Americans, whether they know it or not, have also been scored by state authorities, aided by ill-gotten information and predictive software.

Here’s how it happens: All those private details collected by the many apps on your phone, not to mention the smart home devices, doorbell cameras and, of course, your car — that information winds up in the hands of salespeople known as data brokers. The data brokers, in turn, frequently sell to government agencies, especially law enforcement. Police who spend our tax money to buy this data are exploiting dubious loopholes, carrying out what amounts to a search and seizure en masse, without warrant or subpoena — and it happens every day.

Some U.S. law enforcement bodies have already experimented with feeding the fruits of mass surveillance — faces, social media posts, location data and anything else they can scrounge from the data brokers — into predictive software to generate “threat scores” for individuals.

A Department of Justice report published late last year on A.I. and criminal justice sounded an enthusiastic note on software-generated risk assessments, noting that A.I. actuarial models “can outperform human judgments alone.”

“Transparency is also a concern,” the report acknowledged. “Individuals who are subject to a risk assessment tool (and their representatives) may not know that the tool was used or have sufficient information to understand how it works and how it performs. Affected individuals also may not be aware of the inputs provided to the tool or have an opportunity to correct mistakes.”

It’s not just police. Public schools across the country have enthusiastically embraced “early warning” algorithms that plumb students’ private information to score their likelihood of dropping out. Here, too, lies a cost-benefit problem: advocates for the early warning systems say they protect struggling or at-risk children from slipping unnoticed through the cracks. But many parents have no idea that data on their children’s attendance, behavior and test scores are being gathered and submitted to predictive software.

Even more troubling, school-collected data has sometimes made its way into the hands of law enforcement.

Somehow, in all of this, our understanding of privacy — why it matters and who needs it — seems to have slumped. The men who drafted the earliest list of American rights, having recently fought an insurgency against colonial overlords who barged into their homes and stores whenever they pleased, retained a firm belief in privacy’s outsized importance as a condition of freedom. The Bill of Rights protects a range of privacies — of the home, the body, religious belief and even — as reflected in the Fifth Amendment’s right not to incriminate oneself — knowledge and personal information.

Jeremy L. Daum, a legal scholar and senior fellow at Yale Law School’s Paul Tsai China Center, has spent years living in China and studying the country’s legal system. That work, he said, made him a witness to rapidly shifting attitudes toward privacy both in China and in the United States. He pointed out that Americans, particularly in the wake of Sept. 11, “used to talk about giving up privacy for security.”

“Now we give it up for convenience, and it seems to me that our private information is getting cheaper,” he said. “The bargain is not well earned at this point.”

Back from China, I found myself reading through the Privacy Act of 1974, and felt like I had opened a time capsule. Introducing the legislation, a result of revelations about Watergate and F.B.I. surveillance, Senator Sam Ervin of North Carolina reminded lawmakers that privacy assured that “the minds and hearts of Americans remain free.” To relinquish any bit of information to the government, he warned starkly, was to give away one’s freedom.

“The more the government or any institution knows about us, the more power it has over us,” Senator Ervin said. “When the government knows all of our secrets, we stand naked before official power. Stripped of our privacy, we lose our rights and privileges.”

It’s hard to imagine a leader of today’s Senate speaking with such lucidity about privacy. Since the terror attacks of Sept. 11, we’ve repeatedly heard our leaders denigrate the siloing of our private information as if it were an impediment — and not a critical safeguard meant to protect us from the government. An executive order from Mr. Trump explicitly identifies information silos (in other words, the time-honored and legally mandated practice of federal agencies storing people’s private information secure from view, including by other parts of the government) as a source of “waste, fraud and abuse.”

The cultural shift is, perhaps, as insidious as the surveillance itself. We know, on some level, that we are already exposed before invisible watchers. We are hooked on the tech that comes with it, and we think we can’t change it.

But we can. While Congress and the federal government have, so far, remained feckless against the excesses of surveillance, state and local officials have shown a little more spine.

Just last month, Montana became the first state to close what’s called the “data broker loophole,” restricting the government from buying private information about people — a protection that still, despite years of legislative efforts, doesn’t exist at the federal level. At least 20 states have enacted comprehensive consumer data protection laws, and many cities have tried to prevent the use of facial recognition technology — although police have sometimes worked around the bans by outsourcing to neighboring law enforcement offices.

The companies getting rich off building a surveillance state aren’t going to announce their intentions. Our lawmakers aren’t going to come out and say that, if their voters don’t notice or care, it’s easier for them to avoid confronting the powerful executives and leaders experimenting with ways to spy on us.

Mr. Trump and his tech cronies are charging ahead fast. If we keep sleepwalking into a surveillance state, we may eventually wake up in a place we hardly recognize as our own.


Megan K. Stack is a contributing Opinion writer. She has been a correspondent in China, Russia, Egypt, Israel, Afghanistan and the U.S.-Mexico border area. Her first book, a narrative account of the post-Sept. 11 wars, was a finalist for the National Book Award in nonfiction. @Megankstack
