In the streaming adaptation of Isaac Asimov’s Foundation sci-fi novels, Synnax became a water planet due to overmining of its volcanic vents. In the film Waterworld, the melting of the polar ice caps submerged most of planet Earth. In both cases, as sea levels rose, dry land came to be in ever shorter supply – as did food, potable water, biodiversity, and economic activity other than subsistence fishing and piracy. People migrated in search of life essentials or safer havens. Living as I do for months at a time within 200 meters of the Atlantic Ocean, these stories hit home – I watched both as cautionary tales about climate change wrapped inside big adventure plots.
Sea Rise, Migration, and Language in Real Life
Planets don’t get inundated just in sci-fi films – find your own location on a coastal risk map to check your local risk profile. But what do rising ocean levels have to do with language? Populations in coastal, low-lying areas such as flood plains and on small islands are displaced – and may become refugees. These involuntary migrants – often with few resources, often in less developed countries – tend to include indigenous peoples (also called aboriginal, native, autochthonous, or my favorite descriptor, First Nations, in Canada).
- Societies and cultures erode. Even if they’re all relocated to the same place (which is not always the case), refugees are likely to be assimilated into the majority population. Their culture, art, sociopolitical unity, educational systems, traditional livelihoods, property rights, and farming practices may diminish or disappear – sometimes with dire consequences. For example, in the optimistic “Not Too Late” collection of essays, Jade Begay cites the introduction of non-indigenous land management practices as the cause of wildfires and expanding sand deserts in the American West. The original inhabitants – Native Americans – lived closer to the land than do current inhabitants and didn’t start fires they couldn’t put out.
- Let’s not forget their language – even though the refugees will over time. Indigenous languages encode a society’s ethos, history, traditions, and relationship with the environment. As languages decay, bits of global heritage and knowledge disappear with them. As a philologist in a former life, I studied the artifacts of populations and languages that didn’t survive, learning in the process about societies, like Kyivan Rus’, that followed them. The traditional wisdom in language studies is that a language – and the world it describes – dies every two weeks (26 per year). A statistical analysis found the actual rate to be closer to nine languages per year – still a sizable number to lose – though author Gary Simons projects language extinction accelerating toward the higher, 26-per-year figure by 2150.
Finally, it’s not only about the possibility of language extinction but the blunt reality of decreasing vitality – that is, as usage decays, a language plays less and less of a role in daily life until it just fades away. The Expanded Graded Intergenerational Disruption Scale (EGIDS), developed for SIL International’s Ethnologue, grades language vitality on a scale that runs from vigorous international use (level 0) at the top down to the last speaker’s demise and the subsequent extinction of the language (level 10). Several intermediate states – threatened, shifting, moribund, nearly extinct, and dormant – chart the passage from a thriving to a dead language.
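For readers who think in tables rather than prose, the scale above can be sketched as a simple lookup. The level names follow the published EGIDS; the endangerment cutoff helper is my own illustrative shorthand, not part of Ethnologue or any official tooling:

```python
# EGIDS levels as published (Lewis & Simons); the helper below is illustrative only.
EGIDS = {
    "0": "International",
    "1": "National",
    "2": "Provincial",
    "3": "Wider Communication",
    "4": "Educational",
    "5": "Developing",
    "6a": "Vigorous",
    "6b": "Threatened",
    "7": "Shifting",
    "8a": "Moribund",
    "8b": "Nearly Extinct",
    "9": "Dormant",
    "10": "Extinct",
}

def is_endangered(level: str) -> bool:
    """Treat level 6b (Threatened) and below as endangered -- a common, if coarse, cutoff."""
    order = list(EGIDS)  # dicts preserve insertion order in Python 3.7+
    return order.index(level) >= order.index("6b")
```

A language at 6a (Vigorous) is still being passed to children; one at 8a (Moribund) is spoken only by the grandparent generation – which is why the intermediate levels, not just level 10, are where the vitality story plays out.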
Where Do ICT, MT, AI, H2O, and CO2 Fit?
Two things popped up on one of my screens recently. An LSP posted the carbon spend (in kilograms of CO2) of delivering various linguistic services. Then Skype suggested that I use its DALL-E feature to draw a superhero – which I did, specifying my imagined crusader to look something like an ikran, the aerial predator from Avatar. Skype kept offering to draw things for me (today? “Design a dress for me”) until I realized that each prompt threw another dollop of carbon dioxide into the atmosphere – all for giggles, unlike that LSP matter-of-factly informing me of the amount of CO2 an interpreting job would generate. Instead, my ikran-hero added an infinitesimal amount of CO2 that could contribute to rising temperatures, melting ice caps, rising seas, and ultimately to a Synnax or Waterworld inundation.
Much has been written about climate change due to the fossil-fueled industrialization starting in the late 1700s. Today that discussion includes the impact of information technology on the climate – and, more recently, the environmental, social, and economic impact of machine translation (MT) and artificial intelligence (AI). There’s also been a debate in some circles as to whether any of the problems associated with climate change have anything to do with people. Whatever the cause, the rise in temperatures co-occurred with the use of coal, oil, and natural gas to power our world. And the data shows that my DALL-E superhero adds even more carbon to the warming climate, along with my daily business and personal use of gigabit FIOS, ChatGPT, Google MT, iCloud, Egnyte, Amazon, et al.
Just how much carbon dioxide does my daily use of various information and communication technologies contribute? Since my CO2 sensor is backordered at Amazon and won’t be delivered by a gas-powered MB Sprinter for another week, let’s use some published numbers comparing the carbon footprint of a typical American (me) versus the emissions from training a large language model. There are billions of humans on the planet, few as profligate in energy usage as my fellow Americans and me.
While individual human activity emits a lot of CO2, each LLM in turn creates a lot of its own – and many businesses and government agencies plan to build their own models, or have them built for them. Each one of them spews out tons of greenhouse gases. And as forecasted use of generative AI grows, there will be more CO2 to crank up the global thermostat.
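A back-of-the-envelope calculation makes the comparison concrete. The figures below are published estimates, not measurements – roughly 550 metric tons of CO2e for training a GPT-3-class model (per Patterson et al.’s 2021 estimate) and about 16 metric tons of CO2 per year for the average American – and both vary widely by study, hardware, and energy mix:

```python
# Back-of-the-envelope: LLM training emissions vs. one American's annual footprint.
# Both inputs are published estimates; actual numbers depend on hardware and energy mix.
GPT3_TRAINING_TCO2E = 552         # Patterson et al. (2021) estimate for GPT-3 training
US_PER_CAPITA_TCO2_PER_YEAR = 16  # rough average annual US per-capita emissions

person_years = GPT3_TRAINING_TCO2E / US_PER_CAPITA_TCO2_PER_YEAR
print(f"Training one GPT-3-class model ≈ {person_years:.1f} American person-years of CO2")
```

On those assumptions, a single training run emits what roughly 35 Americans produce in a year – a one-time cost per model, but one that multiplies as every business and agency trains its own.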
Let’s not forget the other abbreviations. ICT (information and communication technology) in its various forms could account for more than 20% of global energy use by 2025 – the same ground-zero deadline for turning the corner on carbon emissions. Machine translation, starting with its statistical variants two decades ago, has accounted for a growing share of that usage under the ICT banner. Its latest instance, neural MT, is on the scale of GenAI in terms of its energy consumption, use of cooling water (H2O), and CO2 output.
Remember Your ESG Framework
Independent data analysts tell us that these technologies consume a lot of energy and excrete a lot of carbon dioxide. What sayeth The Source? I asked ChatGPT-4 about its CO2 emissions. It responded that, “The carbon footprint of an individual inference is likely relatively low. This is because once a model is trained, the energy costs to use the model ([that is,] run an inference) are substantially lower than the energy costs to train it.” As for the large language model itself, that’s a different story: “Training models like GPT-3 or GPT-4 are energy-intensive due to the sheer scale of computations involved. The carbon footprint largely depends on the source of the energy used for training. If a data center uses renewable energy, the carbon footprint will be lower than one that relies on fossil fuels.”
And therein lies part of the solution. In our research on the ethics of generative AI, we observed that organizations have adopted frameworks evaluating their actions and investments from the environmental, social, and governance (ESG) perspectives.
- Smaller models take less energy to train. Generative AI developers have already begun the move toward LLMs that require fewer parameters and less time to process. With the viral adoption of this useful form of AI, less is certainly more given the volumes in question. “Making Generative AI green” is an obvious tagline for a smart developer or consultancy.
- Renewable energy – duh. Replacing fossil fuels with renewable energy is an obvious and essential step. Apple famously bought a hydroelectric plant way back in 2014 to power a data center, Translated followed suit in 2021 as a carbon-neutral replacement for its MT data network, and Iceland has positioned its mix of geothermal and hydroelectric power generation plus free cooling from its chilly climate for other data centers not looking to get into some other dam business. However you do it, using less – or no – dirty energy to do the work means less gunk on the way out. As for positioning, renewable energy for planet Earth addresses economic and environmental imperatives.
It’s time for LLM consumers to lean on their development teams, technology providers, and service suppliers and assert a basic ESG tenet – “don’t wreck your planet.” Both buyers and suppliers of ICT, MT, AI, and whatever comes next should live by the dictum of “one egg, one basket” – we don’t have the luxury of breaking multiple eggs.