This is Atlantic Intelligence, a newsletter in which our writers help you wrap your mind around artificial intelligence and a new machine age. Sign up here.
Perhaps the most important element of biology to understand is our own cells. If scientists could easily predict how a mutation, virus, drug, or any other change would affect a cell, and in turn all the tissues and organs it serves, they could rapidly unlock new vaccines and drugs. Multiple cell biologists recently described this to me as a long-standing “holy grail” of their field.
But human cells are also among the most difficult things to study. Our bodies consist of tens of trillions of interacting cells, each with its own complex internal machinery. Scientists can't come close to replicating that world in a lab, and they have struggled to do so with computers as well.
That may be changing. In recent decades, scientists have collected troves of DNA and microscopic imaging data from human cells—and now they have a tool, generative AI, that might make sense of all that information. “Much as a chatbot can discern style and perhaps even meaning from huge volumes of written language, which it then uses to construct humanlike prose, AI could in theory be trained on huge quantities of biological data to extract key information about cells or even entire organisms,” I explained in a story this week.
The research is in its early stages, and full-fledged, AI-driven “virtual cells” may never be realized. But biologists have already made substantial progress using the technology to study the basic components of our bodies—and perhaps changing the nature of that study too. As in so many other scientific domains, I wrote, “the ability to explain is being replaced by the ability to predict, human discovery supplanted by algorithmic faith.”
A Virtual Cell Is a ‘Holy Grail’ of Science. It’s Getting Closer.
By Matteo Wong
The human cell is a miserable thing to study. Tens of trillions of them exist in the body, forming an enormous and intricate network that governs every disease and metabolic process. Each cell in that circuit is itself the product of an equally dense and complex interplay among genes, proteins, and other bits of profoundly small biological machinery.
Our understanding of this world is hazy and constantly in flux. As recently as a few years ago, scientists thought there were only a few hundred distinct cell types, but new technologies have revealed thousands (and that's just the start). Experimenting in this microscopic realm can be a kind of guesswork, and even success is frequently confounding: Ozempic-style drugs were thought to act on the gut, for example, but might turn out to be brain drugs, and Viagra was initially developed to treat cardiovascular disease.
What to Read Next
- Why a cognitive scientist put a head cam on his baby: “Lake hopes to one day feed the data from Luna and others back into his own models,” my colleague Sarah Zhang wrote last year, “to find better ways of training AI, and to find better ways of understanding how children pull off the ubiquitous yet remarkable feat of learning language.”
- Science is becoming less human: “For centuries, knowledge of the world has been rooted in observing and explaining it,” I wrote in 2023. “Many of today’s AI models twist this endeavor, providing answers without justifications and leading scientists to study their own algorithms as much as they study nature.”
P.S.
Earlier this week, Meta announced that it was ending its professional fact-checking program, starting with the United States. “Good riddance,” my colleague Ian Bogost wrote. Fact-checking is supposed to be a time-consuming, complicated practice that “imbues a published work with an ethos of care.” But what social-media platforms such as Facebook have implemented is surface-level, at best, and “tarnished the idea that fact-checking could be something more.”
— Matteo
Can Generative AI Uncover the 'Language of Biology'? appeared first in The Atlantic.