CleanTechnica
November 6, 2024
The Great AI/Data Center Scam
AI is the word on everyone’s lips these days. Oh, the wonders we will see! According to the University of Cincinnati, “AI is the branch of computer science dedicated to creating machines that are able to think like humans. It’s about taking the unique abilities of the human brain to understand, react, and interpret and incorporating those abilities into computers and other devices.” Ever since ChatGPT became publicly available in 2022, the demand for more and larger data centers has exploded. An AI internet search uses ten times as much electrical energy as a conventional search. The implications of that should be evident to everyone.
The demand for electricity is creating fault lines in society, with many warning that Big Data companies like Google, Microsoft, Meta, Amazon, Tesla, and others will happily ask utility customers to pay for new generating capacity and transmission upgrades so they can stuff hundreds of billions of dollars into their corporate pockets. That has sparked contentious policy disputes in a number of US states. Recently, Ohio proposed new energy tariffs that would require data center operators to pay for 85% of their projected electricity demand over the next 20 years, so that those costs fall on the data centers rather than on residential and small business utility customers.
In Oregon, electric utilities are warning regulators that consumers need protection from rising rates caused by data centers, according to a report by the Washington Post. In Virginia, Ohio, and South Carolina, tech companies are battling over how much responsibility they bear for rate increases, trying to fend off anger from customers. In the Mid-Atlantic region, data centers are being blamed for a sharp rise in energy costs. Further rate increases of up to 20 percent are expected in 2025. “A lot of governors and local political leaders who wanted economic growth and vitality from these data centers are now realizing it can come at a cost of increased consumer bills,” said Neil Chatterjee, former chair of the Federal Energy Regulatory Commission (FERC). Tech companies and several of the utility companies serving them strongly deny they are burdening others. They claim the higher utility bills are paying for overdue improvements to the power grid that benefit all customers.
PJM Struggles With Demand For AI & Data Centers
PJM Interconnection is the grid operator that serves all or parts of 13 states and Washington, D.C. Its most recent auction to secure power for the grid during periods of extreme weather and high demand saw an 800 percent jump in the price that its member utilities have to pay. The impact will be felt by millions by the spring, according to the Washington Post. Utility bills will increase by as much as 20 percent for customers of a dozen utilities in Maryland, Ohio, Pennsylvania, New Jersey, and West Virginia, regulatory filings show. In the Baltimore area, annual bills will go up by as much as $192 on average, said Maryland People’s Counsel David Lapp, a state appointee who monitors utilities. The next auction, in 2025, could be more painful, Lapp said, leaving customers potentially “looking at increases of as much as $40 to $50 a month.”
PJM Interconnection faces supply shortages that have made power more costly. Older power plants are going out of service faster than new generation is coming online. Reliability issues at large gas and coal plants during extreme weather events have also driven up prices. But the sudden, unprecedented demand of data centers, some of which can consume an entire city’s worth of power, compounds the effects of all those problems, experts say. In Virginia, which has aggressively recruited data center development, new data centers are projected to increase demand for power by as much as 50 percent by 2030, according to the consulting firm Aurora Energy Research. Over the next 15 years, the state will need to add electricity supply equal to the amount used by the entire state of New Jersey, Aurora found.
The most recent forecast from Virginia’s biggest utility, Dominion Energy, projects that between now and 2035, residential electricity prices will grow at three times the annual rate they did over the last 16 years. Dominion executives say customer bills in the state are still lower than the national average and the proposed cost increases for the coming decade are consistent with recent inflation rates. Actually, inflation is cooling, but nothing must be allowed to stand in the way of profits, and if blaming inflation makes that happen, where’s the harm? It’s just bidness, after all. But the Virginia State Corporation Commission warned in a recent advisory that demand for power from data centers is “creating issues and risks for electric utilities and their customers that have not heretofore been encountered.” The commission, which regulates utilities, is meeting this week to consider potential ratepayer protection options.
Cutting Deals For Data Centers
Advocates cite another source of cost-shifting onto consumers: discounted rates that power companies and local government officials use to entice tech companies to build data centers. The South Carolina Small Business Chamber of Commerce is urging that state’s regulators to rethink the discounts and other subsidies government and utility leaders used to attract a Google data center in the southeastern part of the state. “The power drain of companies like Google is enormous,” said Buddy Delaney, whose family has been manufacturing and selling custom mattresses in the Columbia, South Carolina, area for 96 years. “We don’t think small businesses like ours should be subsidizing special electricity rates for these companies that have billions of dollars in revenue.”
Google worked out a deal with Dominion Energy, approved by state regulators, to pay just 6 cents per kilowatt-hour for electricity, less than half of what residential customers pay. That rate difference works out to about $1,000 a month more on the utility bill of Delaney’s company, Best Mattress. “Current residential ratepayers are going to pay a lot, lot more because of data centers that bring almost no employees,” Chip Campsen, a Republican state senator, said when the Google deal came to light at a legislative hearing in September. “They are going to pay because they have to participate in paying the capital costs for building the generating capacity for these massive users of energy.”
Google’s head of data center energy, Amanda Peterson Corio, said in a statement that the company is “working closely with our utility partners in all the communities where we operate to ensure our growth does not impact existing ratepayers.” She said Google’s energy supply contracts undergo “rigorous review” by utility regulators “to ensure that Google covers the utility’s cost to serve us.” Dominion said in a statement that its contract with Google “covers the investments required to serve the project, including transmission lines and other facilities,” and it includes “terms to ensure other customers, such as residential and small businesses, do not unfairly incur additional costs.” That is a blatant lie. Google is getting electricity for half what other customers are paying. That is unfair and unjust.
Recently, Microsoft hatched a plan to recommission the shuttered nuclear reactor at Three Mile Island to power some of its data centers. That plan has since been put on hold by regulators, who cited the potential impact on consumers among their concerns. The plan threatened to stick ratepayers with a bill of $50 million to $140 million, according to testimony from AEP and utility conglomerate Exelon. Data center operators are desperate for electricity from any source. The newsroom at CleanTechnica got a press release last week touting new modular nuclear power stations as a solution for powering data centers. Older coal- and methane-fired generating stations are being kept in operation beyond their scheduled retirement dates, which is clearly at odds with state and federal emissions targets. Near Memphis, Newsweek reports, Elon Musk’s new AI data center, modestly known as Colossus, is using methane-powered turbines to supply it with electricity while upgrades to the local utility grid are completed, much to the consternation of city residents who have to breathe the crud Musk is putting into the air.
Apple Weighs AI In The Balance And Finds It Wanting
All of this to usher in the new era of so-called artificial intelligence. Recently, engineers at Apple explored the capabilities of the large language models used to process AI tasks. They concluded there is a lot less to AI than its proponents claim. Their tests revealed that slight changes in the wording of queries can produce significantly different answers, undermining the reliability of the models. The group investigated the “fragility” of mathematical reasoning by adding contextual information to their queries that a human could understand but that should not affect the fundamental mathematics of the solution. The result was varying answers, which shouldn’t happen, reports Apple Insider.
“Specifically, the performance of all models declines [even] when only the numerical values in the question are altered in the GSM-Symbolic benchmark,” the group wrote in their report. “Furthermore, the fragility of mathematical reasoning in these models [demonstrates] that their performance significantly deteriorates as the number of clauses in a question increases.” The study found that adding even a single sentence that appears to offer relevant information to a given math question can reduce the accuracy of the final answer by up to 65 percent. “There is just no way you can build reliable agents on this foundation, where changing a word or two in irrelevant ways or adding a few bits of irrelevant info can give you a different answer,” the study concluded.
An example that illustrates the issue was a math problem that required genuine understanding of the question. The task the team developed, called “GSM-NoOp,” was similar to the kind of mathematical word problems an elementary school student might encounter. It started with the information needed to formulate a result. “Oliver picks 44 kiwis on Friday. Then he picks 58 kiwis on Saturday. On Sunday, he picks double the number of kiwis he did on Friday.” The query then added a clause that appears relevant but actually has no bearing on the final answer: of the kiwis picked on Sunday, “five of them were a bit smaller than average.”
The question then simply asked, “How many kiwis does Oliver have?” The note about the size of some of the kiwis picked on Sunday should have no bearing on the total number of kiwis picked. However, OpenAI’s model as well as Meta’s Llama3-8b subtracted the five smaller kiwis from the total. “We found no evidence of formal reasoning in language models,” the new study concluded. The behavior of LLMs “is better explained by sophisticated pattern matching,” which the study found to be “so fragile, in fact, that [simply] changing names can alter results.”
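For readers who want to check the arithmetic, here is a minimal sketch in Python of the calculation the kiwi question actually calls for. The variable names are ours for illustration; the numbers come straight from the problem as quoted above, and the correct total never involves the five smaller kiwis.

# The GSM-NoOp kiwi problem, worked out directly.
friday = 44                    # kiwis picked on Friday
saturday = 58                  # kiwis picked on Saturday
sunday = 2 * friday            # "double the number he did on Friday" = 88

correct_total = friday + saturday + sunday    # 44 + 58 + 88 = 190

# The distractor ("five of them were a bit smaller than average") changes
# nothing about how many kiwis Oliver has, yet the tested models
# subtracted it anyway.
smaller_kiwis = 5
model_answer = correct_total - smaller_kiwis  # 185, the wrong answer

print(correct_total, model_answer)            # prints: 190 185

In other words, the right answer is 190 kiwis; subtracting the irrelevant detail yields 185, which is what the models reportedly did.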
For this, we are pumping billions more tons of greenhouse gas emissions into the atmosphere and forcing ordinary people to pay higher utility bills? How insane is that? AI is like the shiny object in Lord of the Rings. It’s the latest “must have” accessory for a full and happy life, apparently. Its proponents claim it will generate trillions of dollars in economic benefits, but many are skeptical of that claim. If it were announced that charging electric vehicles would double demand for electricity, the screams of protest could be heard all the way to the dark side of the moon, but because it involves a shiny new toy, there is hardly a peep of protest other than from a few malcontents around Memphis and Buddy Delaney in South Carolina. They can go whistle for all the good their complaints will do. AI is coming whether we need it or not. We don’t need AI to know that.
https://cleantechnica.com/2024/11/06/the-great-ai-data-center-scam/