Since “Black Mirror” debuted in 2011, the dystopian sci-fi anthology series has taken seeds of nascent technology and expanded them to absurd and disturbing proportions.
In doing so, it has become a commentary on defining issues of the 21st century: surveillance, consumerism, artificial intelligence, social media, data privacy, virtual reality and more. Every episode serves in part as a warning about how technological advancement run rampant will lead us, often willingly, toward a lonely, disorienting and dangerous future.
Season 7, newly available on Netflix (the streamer acquired the show from Britain’s Channel 4 after its first two seasons), explores ideas around memory alteration, the fickleness of subscription services and, per usual, the validity of A.I. consciousness.
Here’s a look back at a few themes from past episodes that seemed futuristic at the time but are now upon us, in some form or another. Down the rabbit hole we go:
‘Be Right Back’
Season 2, Episode 1
A.I. imitations, companion chatbots and humanoid robots
When Martha’s partner, Ash, dies in a car accident, she’s plunged into grief. At his funeral, she hears about an online service that can help soften the blow by essentially creating an A.I. imitation of him built from his social media posts, online communications, videos and voice messages.
At first she’s skeptical, but when she finds out she’s pregnant, she goes through with it. She enjoys the companionship she finds by talking with “him” on the phone and starts neglecting her real-life relationships. She soon decides to take the next step: having a physical android of Ash created in his likeness. But as she gets to know “him,” a sense of the uncanny valley quickly sets in.
The same year that this episode aired, 2013, the concept was also the focus of Spike Jonze’s Oscar-winning movie “Her.”
These days, A.I. companionship is rising quickly. Services like Replika have millions of users. Replika started when its founder, the A.I. leader Eugenia Kuyda, lost her best friend. After his death, she fed their email and text conversations into a language model and, in a way, resurrected him via chatbot. Last year, Kuyda told The Verge that being “married” to your chatbot isn’t necessarily a bad thing.
In January, a New York Times story titled “She Is in Love With ChatGPT” explored how deeply people are bonding with their artificial companions, the lengths to which these partners can be customized and the ways these relationships can isolate users from their real lives.
“Within the next two years, it will be completely normalized to have a relationship with an A.I.,” Bryony Cole, the host of the podcast “Future of Sex,” said in an interview for the article.
‘Metalhead’
Season 4, Episode 5
A.I. control problems, drones and autonomous robots
When this episode aired in 2017, Boston Dynamics had already created its four-legged mobile robot referred to as a “dog,” a muscular Terminator-like entity that inspired the episode.
In “Metalhead,” Bella (played by Maxine Peake) is being hunted in a postapocalyptic hellscape by similar robot dogs that have seemingly malfunctioned and are now fixated on tracking and destroying humans. The sophisticated killing machines can’t be outsmarted for long and are stunning in their ingenuity, relentlessness and efficiency.
Boston Dynamics has continued evolving its products, including humanoid robots that can even dance. The company’s Spot robotic dog has been available for purchase for a few years, but when the New York Police Department deployed the machine in 2021, fierce backlash ensued, quickly cutting its run short. Now the city’s fire department uses two for precarious missions.
But most of all, the episode serves as an allegory for increasingly urgent anxieties around autonomous A.I. and control problems as they relate to the use of drones, whether they’re delivering packages or engaging in warfare.
In March, The Times’s tech columnist Kevin Roose made a chilling point: In the next year or two, there’s a very real possibility that artificial intelligence will end our species’ monopoly on human-level intelligence — and that we are completely unprepared for it.
‘White Bear’ and ‘Shut Up and Dance’
Season 2, Episode 2 and Season 3, Episode 3
Online vigilantism and social media spectacle
These two episodes arguably deliver the most memorable twist endings of the series.
In both stories, protagonists are being tortured in one way or another, and viewers, compelled to feel sympathy, don’t learn until the end that these characters are in fact being punished for crimes against children.
Themes around vigilantism, the genre of true crime, the appetite for spectacle and desensitization to violence — and technology’s effect on it all — wind their way through.
These episodes, from 2013 and 2016 respectively, foreshadowed the rise in online vigilantism.
A Times investigation published last month illuminated the evolution of vigilante pedophile hunters on loosely moderated social media platforms, a movement that has accelerated over the past two years.
The analysis found that these hunters chase, beat and humiliate their targets — with a surge of violent content posted in just the past year. The content caters to young men, and commenters often cheer on the violence and even suggest new methods of torture.
This phenomenon of pedophile hunters stands out because it adopts “a social media influencer model, using real-life violence to build a following online,” the report states.
‘Arkangel’
Season 4, Episode 2
Child tracking and behavior monitoring
There’s a popular meme about millennial kids that reads: “We memorized phone numbers. We memorized driving directions. No one knew what we looked like. No one could reach us. We were gods.” That freedom to exist unmonitored seems unthinkable today.
In this episode, Marie is shaken after briefly losing her young daughter, Sara, in their neighborhood, so she signs up to have a cutting-edge device implanted into Sara via a service called Arkangel. The implant includes location tracking and medical data collection, as well as an audiovisual feed from Sara’s perspective that allows Marie to blur whatever she deems too distressing for her daughter (like sexual or violent images).
What unravels from there is a story of a relationship manipulated, warped and destroyed by the technology. In the end, Marie’s compulsion to monitor and interfere in Sara’s life as she comes of age is what drives them irreparably apart.
These days, just about everyone is tracked, including (and maybe particularly) children and teenagers. Apple’s Find My app and Apple AirTags, which are intended to help locate objects like keys and bags, are common ways to monitor a person.
A simple Google search will serve up numerous lists titled “the best GPS trackers for kids.” Likewise, we now have smart watches that monitor heart rate, oxygen levels and more. Last year, the Google-owned brand Fitbit introduced a smartwatch specifically for children. There’s also Gizmo, Wizard Watch and TickTalk.
In 2020, The Times columnist Jessica Grose warned parents about these tools, arguing that they hamper little ones’ road to independence, preventing them from feeling truly free.
Yet the digital umbilical cord is becoming harder to sever even when children go to college. Apps like the popular Life360 allow parents to get updates and alerts about the granular details of a young adult’s behavior.
“I cannot take it anymore,” reads a post on Reddit about Life360, prompting hundreds of replies and thousands of up-votes. “It’s not worth the crying and panic attacks you will cause your child.”
‘Fifteen Million Merits’
Season 1, Episode 2
Screen dependence, inescapable ads and A.I. followers
In this episode, a fan favorite that helped establish the series, Daniel Kaluuya stars as Bing, a young man who lives in a society where people must cycle on stationary bikes to earn merits, a type of currency, in order to pay for everyday costs (insert all metaphors about the grind here). He also lives in a room that’s encased by screens on which he can play video games and watch shows. The screens wake him up every morning.
If Bing tries to look away from an advertisement — and doesn’t have enough merits to skip it — he’s met with a piercing sound and a voice that repeats “resume viewing” until he opens his eyes. The plot point serves as a precursor to the subscription tiers that many streaming services employ today, in which you can opt out of ads only for a price (and sometimes not at all). As for ads pausing until they have your attention, that’s increasingly the case, too.
But it’s the episode’s virtual talent show, “Hot Shot,” with its artificial audience, that has come back around. During the pandemic, virtual audiences were installed for “America’s Got Talent” and “Britain’s Got Talent,” and artificial crowd noise was applied to televised sporting events, dividing viewers.
Now, nearly 14 years after the episode aired, there’s an app called Famefy that allows users to assemble millions of A.I. bots that simulate devoted followers and cheering fans. It’s an immersive alternate reality that replicates being social media famous, even if no one is real but you.
In an interview this month on the Times columnist Ezra Klein’s podcast, the social psychologist Jonathan Haidt — author of the hugely popular “The Anxious Generation” — called Famefy “one of the most disgusting apps I’ve ever seen.”
“This is the most ‘Black Mirror’ [thing] I’ve ever heard,” Klein, using stronger language, replied.
Maya Salam is an editor and reporter, focusing primarily on pop culture across genres.