A year ago, a 1,200-acre stretch of farmland outside New Carlisle, Ind., was an empty cornfield. Now, seven Amazon data centers rise up from the rich soil, each larger than a football stadium.
Over the next several years, Amazon plans to build around 30 data centers at the site, packed with hundreds of thousands of specialized computer chips. With hundreds of thousands of miles of fiber connecting every chip and computer, the entire complex will form one giant machine intended solely for artificial intelligence.
The facility will consume 2.2 gigawatts of electricity — enough to power a million homes. Each year, it will use millions of gallons of water to keep the chips from overheating. And it is being built with a single customer in mind: the A.I. start-up Anthropic, which aims to create an A.I. system that matches the human brain.
The complex — so large that it can be viewed completely only from high in the sky — is the first in a new generation of data centers being built by Amazon, and part of what the company calls Project Rainier, after the mountain that looms near its Seattle headquarters. Project Rainier will also include facilities in Mississippi and possibly other locations, like North Carolina and Pennsylvania.
Project Rainier is Amazon’s entry into a race by the technology industry to build data centers so large they would have been considered absurd just a few years ago. Meta, which owns Facebook, Instagram and WhatsApp, is building a two-gigawatt data center in Louisiana. OpenAI is erecting a 1.2-gigawatt facility in Texas and another, nearly as large, in the United Arab Emirates.
These data centers will dwarf most of today’s, which were built before OpenAI’s ChatGPT chatbot inspired the A.I. boom in 2022. The tech industry’s increasingly powerful A.I. technologies require massive networks of specialized computer chips — and hundreds of billions of dollars to build the data centers that house those chips. The result: behemoths that stretch the limits of the electrical grid and change the way the world thinks about computers.
Amazon, which has invested $8 billion in Anthropic, will rent out the new facility's computing power to its start-up partner. An Anthropic co-founder, Tom Brown, who oversees the start-up's work with Amazon on chips and data centers, said having all that computing power in one spot could allow Anthropic to train a single A.I. system.
“If you want to do one big run, you can do it,” he said.
Amazon has been working on the technology used in this complex for almost a decade. In 2015, the tech giant acquired an Israeli chip designer, Annapurna Labs. A year later, at a lab in Austin, Texas, Annapurna — which continued to operate as a largely independent team of engineers — began designing the company’s first computer chip dedicated to A.I.
This initial chip, called Inferentia, was not widely used. But developing a viable computer chip requires years of design. Annapurna Labs developed its latest chip, Trainium 2, alongside engineers at Anthropic. They tailored it for a massive complex like the one in New Carlisle.
“It’s a journey,” said Gadi Hutt, senior director of customer and product engineering at Annapurna Labs.
The Amazon chips are not as elaborate or as powerful as the latest chips from Nvidia, the Silicon Valley chipmaker that dominates A.I. work. But Amazon believes that by packing twice as many of these simpler chips into each data center, it can provide more computing power using the same amount of electricity.
“If we provide the performance that our customers want,” Mr. Hutt said, “then why choose to make a lot of exotic engineering choices that will just slow us down and cause delays?”
Amazon, which has been building data centers for more than 18 years to run its online retail business and to rent computing services to other businesses, has accelerated its data center expansion for work on A.I. “pretty significantly,” said Prasad Kalyanaraman, an Amazon vice president, standing in a construction trailer at the site in Indiana.
Just a few months after OpenAI released ChatGPT in late 2022, Amazon was in talks with electrical utilities to find a site for its A.I. ambitions. In Indiana, a subsidiary of American Electric Power, or AEP, suggested that Amazon tour tracts of farmland 15 miles west of South Bend that had been rezoned into an industrial center. By the end of May 2023, more than a dozen Amazon employees had visited the site.
By early 2024, Amazon owned the land, which was still made up of corn and soybean fields. Indiana’s legislature approved a 50-year sales tax break for the company, which could ultimately be worth around $4 billion, according to the Citizens Action Coalition, a consumer and environmental advocacy organization. Separate property and technology tax breaks granted by the county could save Amazon an additional $4 billion over the next 35 years.
The exact cost of developing the data center complex is not clear. In the tax deal, Amazon promised $11 billion to build 16 buildings, but it now plans to build almost twice as many. The total number of buildings has not been determined and will depend in part on whether the company gets permission, over vocal community opposition, to build on a 10-acre wetland in the middle of the complex. Amazon intends to build on the site, arguing that it is a small, shallow wetland, not a major nature preserve.
To complete construction as fast as possible, Amazon hired four general contractors to work simultaneously.
“I don’t know if they’re competing for cash or steak dinner or what, but it’s crazy how much they’re getting up,” said Bill Schalliol, a county economic development official. “Steel starts to go up here, the next day steel’s going up over there.”
On a typical week, about 4,000 workers are on site, Mr. Schalliol said. Local hotels have been full, and there has been such an uptick in congestion and traffic accidents that Amazon agreed to pay $120,000 to cover overtime for traffic enforcement and an additional $7 million for road improvements.
To bury the fiber optic cables connecting the buildings and to install other underground infrastructure, Amazon had to pump water out of the wet ground. One permit application showed that the company requested permission to pump 2.2 million gallons an hour, for 730 days. State officials are now investigating whether the process, known as dewatering, is the reason some neighbors are reporting dry wells.
Some locals have protested the way the project has progressed, complaining that it has caused water problems, increased traffic and noise, and significantly altered the look and feel of this agricultural community, and that it could ruin the small natural wetland in the middle of the complex.
“You can see the mountain of dirt they are ready to shove on those wetlands,” said Dan Caruso, a retired mail carrier who lives in New Carlisle, pointing to a cluster of tall trees next to a newly plowed parking lot at the construction site. “Wildlife depends on those wetlands.”
By early June, seven buildings had been constructed and bulldozers were moving dirt on the site of an eighth. What is currently being used as a parking lot will soon become the ninth.
Amazon’s approach differs from that of Google, Microsoft and Meta, companies that are packing far more powerful chips into their data centers and relying on more energy-intensive techniques to cool the chips down. Because Amazon is using significantly smaller chips, the company can cool its new complex in simpler ways. It pumps air from outside the buildings through handlers the size of cargo containers and, in hot months, uses municipal water to cool the air.
The approach is more efficient, according to Mr. Kalyanaraman, so the company can use more of the available electricity to run its A.I. chips.
AEP has told regulators that new, large-scale data centers will more than double the amount of peak power it must provide Indiana, from about 2.8 gigawatts in 2024 to more than seven gigawatts by approximately 2030. Amazon’s campus alone accounts for about half of the additional load growth.
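Those figures imply the scale of Amazon's share. The sketch below is a rough back-of-envelope check using only the numbers reported here, with the projected peak rounded to seven gigawatts (AEP's filing says only "more than seven"):

```python
# Back-of-envelope check of the load-growth figures reported above.
# All values come from the article; the 2030 peak is an approximation.

peak_2024_gw = 2.8         # AEP's Indiana peak load in 2024
peak_2030_gw = 7.0         # projected peak by roughly 2030
amazon_campus_gw = 2.2     # the New Carlisle complex at full build-out

added_load_gw = peak_2030_gw - peak_2024_gw       # about 4.2 gigawatts of new peak demand
amazon_share = amazon_campus_gw / added_load_gw   # about 0.52, i.e. roughly half

print(f"Added peak load: {added_load_gw:.1f} GW")
print(f"Amazon campus share of the growth: {amazon_share:.0%}")
```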
“It will be the largest power user in the state of Indiana by a country mile,” said Ben Inskeep of the Citizens Action Coalition.
The utility told regulators in April that it expected to use natural gas plants to provide about three-quarters of the additional power that would be needed by 2030.
As Amazon expands this massive facility, some experts are beginning to ask whether the rapid progress of A.I. over the last several years will soon hit a wall. Some studies show that progress is slowing. But Mr. Kalyanaraman said this was not a risk.
Anthropic plans to train — essentially build — A.I. systems with this giant complex. But Mr. Kalyanaraman said that if training became significantly more efficient or if A.I. development hit a wall, it could also be used to deliver A.I. technologies to customers.
“The amount of infrastructure we’re building here is so much that if you dedicate it just for training, it’s not efficient,” he said. “We expect to use these same clusters for multiple needs.”
Karen Weise writes about technology for The Times and is based in Seattle. Her coverage focuses on Amazon and Microsoft, two of the most powerful companies in America.
Cade Metz is a Times reporter who writes about artificial intelligence, driverless cars, robotics, virtual reality and other emerging areas of technology.