The older generation always discounts the workplace complaints of the younger generation. In my 20s, there seemed to be an endless supply of commentary about how we millennials were lazy and entitled, just like the members of Generation X before us were slackers. Members of Gen Z get the bad rap of being “unemployable,” because apparently they do not prize achievement for its own sake, or they’d rather be influencers because the internet has broken their brains.
Gen Z-ers don’t even deserve this perfunctory slander, because the entire process of getting and keeping an entry-level job has become a grueling and dehumanizing ordeal over the past decade.
Certainly the job market seems grim in this moment. Michael Madowitz, the principal economist at the Roosevelt Institute, described it as “an awful traffic jam.” “If you’re just out of college, you’re trying to merge into a freeway and nobody is letting you in,” he explained. Employers at companies like Airbnb and Intuit almost sound excited talking to The Wall Street Journal about staying lean and culling the number of employees they have, as long as it creates short-term profits.
But the whole experience of work for young people has been tortured for far longer than the economy has been stalled. Earlier this year, my colleague David Brooks spoke to a college senior who called young Americans “the most rejected generation,” describing the hypercompetition that has bled into all aspects of life, even for the most privileged college-educated strivers.
Because most job applications are submitted online, the bar to applying is far lower than it was in the analog world decades ago, so applicants for any open role are competing with hundreds of people. The sense of scarcity starts even earlier, because so many selective colleges boast about their record-low admissions rates.
But now artificial intelligence is performing the first rounds of culling, which is further dehumanizing and gamifying the application process. Richard Yoon, who is an economics major at Columbia, told me that when his peers have multiple interviews for jobs in finance, he asks if they heard back from any of them. They tell him: “You don’t understand. Like 19 of those 20 interviews were with bots.”
It’s customary for job seekers to review their résumés for keywords they think A.I. likes, Yoon told me, so that they might have a chance of getting through the digitized gantlet and one day making human contact that could possibly lead to a job offer. Or at the very least a real-life networking connection. Yoon called the process “dystopian.”
But once you actually have a job, the real dystopia begins. Young people feel as if jobs offer far less mentorship and more micromanaging. Stevie Stevens, who is 27 and lives in Columbus, Ohio, told me that she left a full-time job in July at an exhibition design and production firm because she felt hyperscrutinized and undersupported. “Managers expect you to do six jobs in a 40-hour workweek. My company had mediocre benefits and offered little to no professional growth or training,” she told me.
Stevens also said that what she calls “surveillance state technologies” — apps that synthesized her personal data to determine her level of effort — are part of that feeling of micromanagement. Though she doesn’t have benefits through work now and deals with more uncertainty as a freelancer, she is happier because she has autonomy and control over her time and her efforts.
For the past several years, employers have used “bossware” to track worker productivity. A Times investigation in 2022 found that across professional fields and pay grades, employers were tracking keyboard use, movements and phone calls, and docking employees for time that they perceived to be “idle.”
That kind of tracking doesn’t account for things like conversations with peers, thinking — you know, with your brain — or, if you work in a warehouse, taking a rest so your body doesn’t fall apart. At least older workers knew a time before this tracking was ubiquitous, and at this point might be senior enough to have the leverage to push back against the most extreme types of surveillance.
It’s no wonder, then, that a working paper published by the National Bureau of Economic Research in July found that young worker despair has been rising in the United States for about a decade. Its co-authors, David Blanchflower and Alex Bryson, analyzed data from the Behavioral Risk Factor Surveillance System, a yearly federal health survey of 400,000 Americans, focusing on how many bad mental health days — ones described as containing “stress, depression and problems with emotions” — a worker had in the past month. They then created a mental despair measurement using the number of bad mental health days, comparing mental despair across demographic, employment and educational characteristics.
Blanchflower and Bryson found that for workers under 25, mental health is now so poor that they are generally as unhappy as their unemployed counterparts, which is new in the past several years. The rise in despair is particularly pronounced among women and the less educated. Last year, job satisfaction for people under 25 was about 15 points lower than it was for people over 55. This was true in the same year that satisfaction rose for every other age group, according to a survey from the Conference Board. The unhappiness of young workers seemed especially pronounced in the past year, whether because of the rapid rise of A.I., the uncertainty of the market or some other rancid combination of post-Covid malaise and general disaffection.
I called Bryson to find out more about why young workers are so unhappy. He has two hypotheses. One is that the perception of work satisfaction has changed: Young people expect to be happier than previous generations were, in part because they’re using social media to compare themselves to some of their peers, only to then find themselves disappointed by the tedium of their own 9-to-5s. But the other hypothesis is in line with what I’m hearing from young people: The workplace is markedly worse.
Employers might not extend the workday, Bryson speculated, but the amount of work expected in each hour is “intensifying” because every move is captured and cataloged by employers. This makes employees feel as though they have no job control, which “is a fundamental tenet in terms of job quality, the idea that you feel that you have some degree of autonomy over what you’re doing rather than just being directed as an automaton,” Bryson said.
Gen Z-ers seem to be having two disparate reactions to this state of play. One is entrepreneurship: Both Stevens and Yoon told me that they see it as potentially safer than corporate work at this point. Yoon told me he saw a family member spend decades at a Fortune 500 company only to get unceremoniously laid off, and it has made him consider a less traditional path. The other is unionization. Bryson wondered if the renewed support for unionization among young people in the United States is an antidote to this misery.
Whatever is going to happen for Gen Z-ers as we all live through the A.I. revolution, I hope that their elders approach them with more compassion than disdain. At least I got rejected to my face when I was in my 20s, which now seems like a luxury I didn’t appreciate.
End Notes
This New York Times Magazine story about MaryBeth Lewis, who gave birth to her first child in her 20s and her 13th child at 62, and ended up in a custody battle for her 14th and 15th children, is burning up all my mom chats right now. The writer, David Gauvey Herbert, found a doozy of a tale that wrestles with questions about bodily autonomy, surrogacy and the outer limits of fertility law and family court. I won’t spoil any of it for you, but the story involves a judge named Chauncey J. Watches — a name so perfect, a novelist couldn’t come up with a better one.
Feel free to drop me a line about anything here.
Jessica Grose is an Opinion writer for The Times, covering family, religion, education, culture and the way we live now.
The post Dehumanizing and Dystopian: How Gen Z-ers See Work appeared first on New York Times.




