Tuesday, December 12, 2023

Why Worry About Resources?

The attitude of the crypto-bros, and of tech more generally, is that they are going to make so much money that paying for whatever resources they need will be a drop in the ocean, and that externalities such as carbon emissions are someone else's problem.

I discussed Proof-of-Work's scandalous waste of energy in my EE380 talk, Can We Mitigate Cryptocurrencies' Externalities?, and elsewhere since 2017, often citing the work of Alex de Vries. Two years ago de Vries and Christian Stoll's Bitcoin's growing e-waste problem pointed out that in addition to mining rigs' direct waste of power, their short economic life drove a massive e-waste problem, adding the hardware's embedded energy to the toll.

de Vries Fig. 1
Now, de Vries' Bitcoin’s growing water footprint reveals that supporting gambling, money laundering and crime causes yet another massive waste of resources.

But that's not all. de Vries has joined a growing chorus of researchers showing that the VCs' pivot to AI wastes similarly massive amounts of power. Can analysis of AI's e-waste and water consumption be far behind?

Below the fold I discuss papers by de Vries and others on this issue.

First, de Vries examines the problem highlighted by Jackie Sawicky of the Texas Coalition Against Cryptomining, water usage by Riot Blockchain's massive Texas mining operation:
Corsicana will sell Riot the water. Jackie only discovered how much water Riot needed via records requests. “They asked for 1.6 million gallons a day in the height of the summer, and we are all being told to conserve,” Jackie said.

“When Riot tells people they are going to be wasting 1.6 million gallons a day, more than the iron smelt, more than the candy factory, they are going to be the number one user of water,” said Jackie.
In Bitcoin’s growing water footprint Alex de Vries discovers some very large water losses:
Amid growing concerns over the impacts of climate change on worldwide water security, Bitcoin’s water footprint has rapidly escalated in recent years. The water footprint of Bitcoin in 2021 significantly increased by 166% compared with 2020, from 591.2 to 1,573.7 GL. The water footprint per transaction processed on the Bitcoin blockchain for those years amounted to 5,231 and 16,279 L, respectively. As of 2023, Bitcoin’s annual water footprint may equal 2,237 GL.
That is, each Bitcoin transaction consumes a small swimming pool's worth of water.
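To put the per-transaction figure in perspective, here is a back-of-envelope sketch in Python. The roughly 20,000 litre capacity of a small backyard pool is my assumption; the other numbers come from the quote above, and the implied transaction count is simply derived from them:

# Sanity-check de Vries' 2021 per-transaction water figure.
annual_footprint_l = 1_573.7e9   # 2021 water footprint: 1,573.7 GL, in litres
per_tx_l = 16_279                # litres per transaction in 2021, from the quote
small_pool_l = 20_000            # assumed capacity of a small backyard pool

implied_tx = annual_footprint_l / per_tx_l
print(f"Implied 2021 on-chain transactions: {implied_tx / 1e6:.1f} million")
print(f"One transaction's water vs. a small pool: {per_tx_l / small_pool_l:.0%}")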

de Vries computes two kinds of losses:
The first involves onsite (direct) water use for cooling systems and air humidification. Water usage depends on cooling system types and local climate conditions. It is important to differentiate between water withdrawal and water consumption in terms of this usage. Water withdrawal pertains to the water taken from surface water or groundwater sources, while water consumption refers to the portion of water that becomes unavailable for reuse after withdrawal, primarily due to evaporation in cooling systems. Water consumption is not extensively studied in Bitcoin mining or generic data center research, as reliable data on water consumption factors are challenging to obtain.
And:
The second way in which miners use water relates to the (indirect) water consumption associated with generating the electricity necessary to power their devices. Thermoelectric power generation plays a major role in water consumption, as a portion of the withdrawn water for cooling purposes evaporates (unless dry cooling technologies utilizing air are employed).
Note some caveats (my emphasis):
These systems can utilize both freshwater and non-freshwater sources. This commentary, however, exclusively focuses on freshwater consumption. ... The total water footprint of Bitcoin examined in this commentary encompasses the freshwater consumed due to both direct and indirect water consumption during the operational stage of Bitcoin mining devices.
It is important to note that consumed water is not the only concern; withdrawn water can be too. Sawicky points out that:
The water gets quite hot — over 100 degrees Fahrenheit — and that hot water needs to go somewhere. Where will it go?

Jackie hasn’t been able to get a clear answer, but she has concerns. “The property Riot purchased here in Navarro County feeds into Richland Creek and Richland Creek feeds into Richland-Chambers Reservoir, and that’s the tap water for Arlington-Fort Worth.”
Hot water kills fish, and any pollution from the Riot facility will end up in the water supply.

de Vries Fig. 2
The US hosts a significant part of the total Bitcoin mining fleet. The New York Times list of large US miners:
includes 34 Bitcoin mines, with the power requirement for each mine ranging from 38 to 450 megawatts (MW) as of March 2023. Together, these 34 mines are responsible for 3.91 GW of power demand, representing roughly a quarter of Bitcoin’s total estimated power demand in the same month (i.e., 16.2 GW) and a majority of the share that can be attributed to the US according to the CCAF (i.e., 37.84% as of January 2022).
de Vries estimates that:
This means the total water footprint of US Bitcoin miners, after adding their direct water consumption, could be an annual 93–120 GL, which is 10%–41% more than the estimated indirect water consumption of 84.9 GL per year. It also means the total water footprint of US Bitcoin miners could be equivalent to the average annual water consumption of around 300,000 US households, comparable with a city such as Washington, DC.
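To see where a number like 300,000 households comes from, here is a sketch assuming the EPA's often-quoted average of about 300 gallons of water per US household per day; de Vries may use a somewhat different per-household figure, so treat this only as a plausibility check:

# Rough plausibility check of the "around 300,000 US households" comparison.
GALLONS_TO_LITRES = 3.785
household_l_per_year = 300 * GALLONS_TO_LITRES * 365   # assumed ~414,000 L/household/year

for total_gl in (93, 120):                             # de Vries' estimated range, GL/year
    households = total_gl * 1e9 / household_l_per_year
    print(f"{total_gl} GL/year is roughly {households:,.0f} households")

The result, between about 220,000 and 290,000 households, is in the same ballpark as the paper's estimate.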
Source
As I write, the Bitcoin blockchain is processing 498,790 transactions/day using 16.84 GW. Igor Makarov and Antoinette Schoar write:
90% of transaction volume on the Bitcoin blockchain is not tied to economically meaningful activities but is the byproduct of the Bitcoin protocol design as well as the preference of many participants for anonymity.
Source
Thus only about 10% are "economically meaningful" transactions between individuals and exchanges (about 2,080/hour), so the average economically meaningful transaction consumes 4 MWh, generates e-waste equivalent to about a quarter of a MacBook Air, and uses 160K liters of water.

For comparison, our house uses 4.3 MWh/year, generates perhaps 2 MacBook Airs of e-waste per year, and consumes 223K liters of water per year. So each economically meaningful Bitcoin transaction consumes almost our annual electricity usage, about 6 weeks of our e-waste generation, and about 70% of our annual water usage.
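As a minimal sketch of that comparison, using the per-transaction figures above and our household's annual numbers (the MacBook Air is just a convenient, and approximate, unit of e-waste):

# Compare one economically meaningful Bitcoin transaction to our household's annual usage.
per_tx = {"electricity (MWh)": 4.0, "e-waste (MacBook Airs)": 0.25, "water (kL)": 160}
house_per_year = {"electricity (MWh)": 4.3, "e-waste (MacBook Airs)": 2.0, "water (kL)": 223}

for resource, value in per_tx.items():
    fraction = value / house_per_year[resource]
    print(f"{resource}: {fraction:.0%} of our annual usage, or about {fraction * 52:.0f} weeks")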

de Vries then follows the current hype cycle and pivots to AI. In The growing energy footprint of artificial intelligence he starts with the notoriously compute-intensive training phase:
Hugging Face reported that its BigScience Large Open-Science Open-Access Multilingual (BLOOM) model consumed 433 MWh of electricity during training. Other LLMs, including GPT-3, Gopher and Open Pre-trained Transformer (OPT), reportedly used 1,287, 1,066, and 324 MWh, respectively, for training. Each of these LLMs was trained on terabytes of data and has 175 billion or more parameters.
de Vries then turns to the less-studied inference phase:
Research firm SemiAnalysis suggested that OpenAI required 3,617 of NVIDIA’s HGX A100 servers, with a total of 28,936 graphics processing units (GPUs), to support ChatGPT, implying an energy demand of 564 MWh per day. Compared to the estimated 1,287 MWh used in GPT-3’s training phase, the inference phase’s energy demand appears considerably higher. Furthermore, Google reported that 60% of AI-related energy consumption from 2019 to 2021 stemmed from inference.
He estimates for Google the energy needed per inference in LLMs:
SemiAnalysis estimated that implementing AI similar to ChatGPT in each Google search would require 512,821 of NVIDIA’s A100 HGX servers, totaling 4,102,568 GPUs. At a power demand of 6.5 kW per server, this would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh. New Street Research independently arrived at similar estimates, suggesting that Google would need approximately 400,000 servers, which would lead to a daily consumption of 62.4 GWh and an annual consumption of 22.8 TWh. With Google currently processing up to 9 billion searches daily, these scenarios would average to an energy consumption of 6.9–8.9 Wh per request. This estimate aligns with Hugging Face’s BLOOM model, which consumed 914 kWh of electricity for 230,768 requests, averaging to 3.96 Wh per request.
...
Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search”. As a standard Google search reportedly uses 0.3 Wh of electricity, this suggests an electricity consumption of approximately 3 Wh per LLM interaction.
And similarly for ChatGPT:
SemiAnalysis’ assessment of ChatGPT’s operating costs in early 2023, which estimated that ChatGPT responds to 195 million requests per day, requiring an estimated average electricity consumption of 564 MWh per day, or, at most, 2.9 Wh per request.
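All these per-request numbers follow from simple division of fleet power by request volume. Here is a sketch reproducing them; every input is taken from the quotes above, nothing is independent data:

# Reproduce the Wh-per-request estimates from the quoted fleet sizes and request volumes.
def per_request_wh(servers, kw_per_server, requests_per_day):
    daily_mwh = servers * kw_per_server * 24 / 1000
    return daily_mwh, daily_mwh * 1e6 / requests_per_day

# ChatGPT (SemiAnalysis): 3,617 HGX A100 servers at ~6.5 kW, 195 million requests/day
daily, wh = per_request_wh(3_617, 6.5, 195e6)
print(f"ChatGPT: {daily:.0f} MWh/day, {wh:.1f} Wh/request")

# "LLM in every Google search" scenarios, 9 billion searches/day;
# 6.5 kW per server assumed for both scenarios, consistent with the quoted totals.
for servers in (512_821, 400_000):
    daily, wh = per_request_wh(servers, 6.5, 9e9)
    print(f"{servers:,} servers: {daily / 1000:.1f} GWh/day, "
          f"{daily * 365 / 1e6:.1f} TWh/year, {wh:.1f} Wh/request")

# BLOOM: 914 kWh for 230,768 requests
print(f"BLOOM: {914e3 / 230_768:.2f} Wh/request")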
de Vries Fig. 1
de Vries considers a worst-case scenario in which every search involved interacting with an LLM:
In 2021, Google’s total electricity consumption was 18.3 TWh, with AI accounting for 10%–15% of this total. The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year),
Fortunately, the hardware supply chain can't deliver the chips needed to do this, and even if it could, Google would not want to make the roughly $100B investment. A more realistic scenario is based on Nvidia's projected sales of 100,000 AI servers in 2023:
these servers would have a combined power demand of 650–1,020 MW. On an annual basis, these servers could consume up to 5.7–8.9 TWh of electricity. Compared to the historical estimated annual electricity consumption of data centers, which was 205 TWh, this is almost negligible.
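The arithmetic behind that range is straightforward; a sketch, assuming the servers run continuously at their rated power (the 100% duty-cycle caveat discussed below):

# Annual consumption of 100,000 AI servers drawing a combined 650-1,020 MW.
HOURS_PER_YEAR = 24 * 365
DATA_CENTER_TWH = 205                          # historical data-center estimate from the quote

for fleet_mw in (650, 1_020):
    twh = fleet_mw * HOURS_PER_YEAR / 1e6      # MW x hours = MWh; /1e6 = TWh
    print(f"{fleet_mw} MW continuously: {twh:.1f} TWh/year "
          f"({twh / DATA_CENTER_TWH:.1%} of the quoted data-center total)")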
Note that this estimate assumes a 100% duty cycle but ignores the power needed to cool the servers, two effects that tend to cancel each other. It also ignores the embedded energy in the servers, and their eventual contribution to the e-waste problem. de Vries concludes:
While the exact future of AI-related electricity consumption remains difficult to predict, the scenarios discussed in this commentary underscore the importance of tempering both overly optimistic and overly pessimistic expectations. Integrating AI into applications such as Google Search can significantly boost the electricity consumption of these applications. However, various resource factors are likely to restrain the growth of global AI-related electricity consumption in the near term. Simultaneously, it is probably too optimistic to expect that improvements in hardware and software efficiencies will fully offset any long-term changes in AI-related electricity consumption. These advancements can trigger a rebound effect whereby increasing efficiency leads to increased demand for AI, escalating rather than reducing total resource use.
Luccioni Fig. 1
de Vries' pivot to the cost of AI inference is joined by Alexandra Sasha Luccioni, Yacine Jernite and Emma Strubell, whose Power Hungry Processing: Watts Driving the Cost of AI Deployment? sets out to:
propose the first systematic comparison of the ongoing inference cost of various categories of ML systems, covering both task-specific (i.e. finetuned models that carry out a single task) and ‘general-purpose’ models, (i.e. those trained for multiple tasks). We measure deployment cost as the amount of energy and carbon required to perform 1,000 inferences on representative benchmark dataset using these models. We find that multi-purpose, generative architectures are orders of magnitude more expensive than task-specific systems for a variety of tasks, even when controlling for the number of model parameters.
As with de Vries, they justify their focus on inference:
According to AWS, the largest global cloud provider, inference is estimated to make up 80 to 90% of total ML cloud computing demand [2, 28], whereas a 2021 publication by Meta attributed approximately one-third of their internal end-to-end ML carbon footprint to model inference, with the remainder produced by data management, storage, and training [56]; similarly, a 2022 study from Google attributed 60% of its ML energy use to inference, compared to 40% for training [40].
Luccioni et al.'s Figure 1 (note the logarithmic Y axis) shows that the range of carbon emissions for the various tasks they tested spans 3 orders of magnitude. Generative tasks are by far the most expensive, with image generation particularly costly.

Luccioni Fig. 2
They report that although increasing model size increases emissions, the effect is less than that of the task:
We do observe a relationship between model size and quantity of emissions produced during inference, with differing progressions for each modality – however, the task structure accounts for more of the variation than the model size does. We can observe once again that text-to-image is by far the most carbon- and energy-intensive task, with smaller image generation models such as segmind/tiny-sd that have around 500M parameters producing magnitudes more carbon than text-to-category models (100 g vs. 0.6 g of CO2 per 1,000 inferences).
...
For context, the most carbon-intensive image generation model (stable-diffusion-xl-base-1.0) generates 1,594 grams of CO2 for 1,000 inferences, which is roughly equivalent to 4.1 miles driven by an average gasoline-powered passenger vehicle.
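The miles-driven equivalence is just a unit conversion. A sketch using the EPA's commonly cited figure of roughly 400 g of CO2 per mile for an average gasoline passenger car (my assumption; the paper may use a slightly different factor, hence 4.0 rather than 4.1 miles):

# Convert per-1,000-inference emissions into miles driven by an average gasoline car.
G_CO2_PER_MILE = 400    # assumed EPA average for a gasoline passenger vehicle

for model, g_per_1000 in [("stable-diffusion-xl-base-1.0", 1_594),
                          ("segmind/tiny-sd", 100),
                          ("text-to-category model", 0.6)]:
    print(f"{model}: {g_per_1000 / 1000:.4g} g CO2 per inference, "
          f"{g_per_1000 / G_CO2_PER_MILE:.3g} miles per 1,000 inferences")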
They conclude:
Using multi-purpose models for discriminative tasks is more energy-intensive compared to task-specific models for these same tasks. This is especially the case for text classification (on IMDB, SST 2 and Rotten Tomatoes) and question answering (on SciQ, SQuAD v1 and v2), where the gap between task-specific and zero-shot models is particularly large, and less so for summarization (for CNN-Daily Mail, SamSUM and XSum). As can be seen in Table 4, the difference between multi-purpose models and task-specific models is amplified as the length of output gets longer.
...
While we see the benefit of deploying generative zero-shot models given their ability to carry out multiple tasks, we do not see convincing evidence for the necessity of their deployment in contexts where tasks are well-defined, for instance web search and navigation, given these models’ energy requirements.
One of the most-hyped applications of AI is autonomous vehicles, and Soumya Sudhakar, Vivienne Sze and Sertac Karaman examine their externalities in Data Centers on Wheels: Emissions From Computing Onboard Autonomous Vehicles, which is summarized by Brandon Vigliarolo in Self-driving car computers may be 'as bad' for emissions as datacenters. Sudhakar et al. use probabilistic models to estimate the power needed for autonomous navigation:
Based on current trends, a widespread AV adoption scenario where approximately 95% of all vehicles are autonomous requires computer power to be less than 1.2 kW for emissions from computing on AVs to be less than emissions from all data centers in 2018 in 90% of modeled scenarios. Anticipating a future scenario with high adoption of AVs, business-as-usual decarbonization, and workloads doubling every three years, hardware efficiency must double every 1.1 years for emissions in 2050 to equal 2018 data center emissions. The rate of increase in hardware efficiency needed in many scenarios to contain emissions is faster than the current rate.
Again, note that in aggregate autonomous vehicles will consume far more power in inference than in training.

Sudhakar et al Fig. 1
History shows that the efficiency of the on-board computers in autonomous vehicles has doubled every 2.8 years. Their models, using plausible assumptions for parameters such as the technology's adoption rate and the growth of the navigation workload, show that unless hardware efficiency increases much faster, emissions from autonomous vehicles will rapidly eclipse the 2018 total from all data centers. For example, their Figure 1 shows:
Emissions from computing onboard AVs driving 1 h/day. With one billion AVs, an average computer power of 0.84 kW yields emissions equal to emissions of all data centers.
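To get a feel for the scale, here is a sketch of the aggregate computing energy in that scenario. Note that the paper's actual comparison is in terms of emissions, which also depends on the carbon intensities it assumes for vehicle and data-center electricity, so the energy totals below are only indicative:

# Aggregate energy of onboard computing for a hypothetical one-billion-AV fleet.
FLEET_SIZE = 1e9          # vehicles, the scenario in Fig. 1
COMPUTER_KW = 0.84        # average onboard computer power, from Fig. 1
HOURS_PER_DAY = 1         # driving time per vehicle assumed in Fig. 1

daily_gwh = FLEET_SIZE * COMPUTER_KW * HOURS_PER_DAY / 1e6
annual_twh = daily_gwh * 365 / 1000
print(f"Onboard computing: {daily_gwh:.0f} GWh/day, roughly {annual_twh:.0f} TWh/year")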
In all these cases we see the industry's total lack of concern for externalities such as carbon emissions, grid stability, e-waste and water consumption. In cryptocurrency's case we can add crime and money laundering. In the case of autonomous vehicles we can add death, injury and traffic congestion. All these papers propose ways to reduce the externalities they document, but in almost all cases my reaction is "good luck with that!".

5 comments:

Geoff said...

Excellent analysis; thank you.

Tardigrade said...

Are computer drivers expected to be less, equal, or more power-train efficient from start to destination than the average human driver? Possibly factoring in average accident rates and accident energy costs (such as repairs).

This is the huge question with respect to autonomous driving power efficiency. Comparing to data centers is basically an apples-to-oranges comparison. Informational for certain purposes, but likely not the primary factor that needs to be considered.

David. said...

David Pan's US Bitcoin Miners Use as Much Electricity as Everyone in Utah reports that:

"Bitcoin miners in the US are consuming the same amount of electricity as the entire state of Utah, among others, according to a new analysis by the US Energy Information Administration. And that’s considered the low end of the range of use.

Electricity usage from mining operations represents 0.6% to 2.3% of all the country’s demand in 2023, according to the report released Thursday. It is the first time EIA has shared an estimate. The mining activity has generated mounting concerns from policymakers and electric grid planners about straining the grid during periods of peak demand, energy costs and energy-related carbon dioxide emissions."

David. said...

Brandon Vigliarolo's US starts 'emergency' checks on cryptocurrency power use, citing winter power demands reports that:

"the Energy Information Administration, part of the US Department of Energy, has been granted funding for a six-month study into cryptocurrency energy use, which will involve collecting and analyzing grid utilization data from scores of mining operations.

The EIA justified [PDF] the emergency nature of its investigation by arguing an ongoing bitter cold snap in the country along with a recent spike in the price of Bitcoin, and thus demand for the digital money, could cause an unnecessary drag on the US power grid this year and push up people's bills.
...
The EIA told us it hopes to develop a base snapshot of cryptomining companies and their energy usage, quantify how much energy usage by identified miners fluctuates, pinpoint energy sources, and identify regions where cryptomining is concentrated."

David. said...

Eamon Farhat reports that Electricity Demand at Data Centers Seen Doubling in Three Years:

"Global electricity demand from data centers, cryptocurrencies and artificial intelligence could more than double over the next three years, adding the equivalent of Germany’s entire power needs, the International Energy Agency forecasts in its latest report.

There are more than 8,000 data centers globally, with about 33% in the US, 16% in Europe and close to 10% in China, with more planned. In Ireland, where data centers are developing rapidly, the IEA expects the sector to consume 32% of the country’s total electricity by 2026 compared to 17% in 2022. Ireland currently has 82 centers; 14 are under construction and 40 more are approved.

Overall global electricity demand is expected to see a 3.4% increase until 2026, the report found. The increase, however, will be more than covered by renewables, such as wind, solar and hydro, and all-time high nuclear power."