Access Business Analytics
In the streaming adaptation of Isaac Asimov’s Foundation sci-fi novels, Synnax became a water planet due to overmining of its volcanic vents. In the film Waterworld, the melting of the polar ice caps submerged most of planet Earth. In both cases, as sea levels rose, dry land came to be in ever shorter supply – as did food, potable water, biodiversity, and economic activity other than subsistence fishing and piracy. People migrated in search of life essentials or safer havens. Living as I do for months at a time within 200 meters of the Atlantic Ocean, these films hit home – I watched both as cautionary stories about climate change set within big adventure tales.
Planets don’t get inundated just in sci-fi films – find your own location on a coastal risk map to check your local risk profile. But what do rising ocean levels have to do with language? Populations in coastal, low-lying areas such as flood plains and on small islands are displaced – and may become refugees. These involuntary migrants with fewer resources in less developed countries tend to include indigenous peoples (also called aboriginal, native, autochthonous, or, my favorite descriptor, First Nations in Canada).
Finally, it’s not only about the possibility of language extinction but the blunt reality of decreasing vitality – that is, as usage decays, a language plays less and less of a role in daily life until it just fades away. SIL International developed the Expanded Graded Intergenerational Disruption Scale (EGIDS), used in Ethnologue, which ranges from vigorous international use at the top down to the last speaker’s demise and the subsequent extinction of the language. Several intermediate states – threatened, shifting, moribund, nearly extinct, and dormant – chart the passage from a thriving to a dead language.
Two things popped up on one of my screens recently. An LSP posted the carbon spend (in kilograms of CO2) of delivering various linguistic services. Then Skype suggested that I use its DALL-E feature to draw a superhero – which I did, specifying that my imagined crusader look something like an ikran, the aerial predator from Avatar. Skype kept offering to draw things for me (today? “Design a dress for me”) until I realized that each prompt threw another dollop of carbon dioxide into the atmosphere – all for giggles, unlike that LSP matter-of-factly informing me how much CO2 an interpreting job would generate. Instead, my ikran-hero added an infinitesimally tiny amount of CO2 that could lead to rising temperatures, melting ice caps, rising seas, and ultimately to a Synnax or Waterworld inundation.
Much has been written about climate change driven by the fossil-fueled industrialization that began in the late 1700s. Today that discussion includes the impact of information technology on the climate – and, more recently, the environmental, social, and economic impact of machine translation (MT) and artificial intelligence (AI). There’s also been a debate in some circles as to whether any of the problems associated with climate change have anything to do with people. Whatever the cause, the rise in temperatures co-occurred with the use of coal, oil, and natural gas to power our world. And the data shows that my DALL-E superhero adds even more carbon to the warming climate, along with my daily business and personal use of gigabit FIOS, ChatGPT, Google MT, iCloud, Egnyte, Amazon, et al.
Just how much carbon dioxide does my daily use of various information and communication technologies contribute? Since my CO2 sensor is backordered at Amazon and won’t be delivered by a gas-powered MB Sprinter for another week, let’s use some published numbers comparing the carbon footprint of a typical American (me) versus the emissions from training a large language model. There are billions of humans on the planet, none so profligate in energy usage as me and my fellow Americans.
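To put that comparison in rough numbers, here is a minimal back-of-the-envelope sketch. The two figures are assumptions drawn from widely cited published estimates, not my own measurements: roughly 16 metric tons of CO2 per average American per year, and roughly 552 metric tons of CO2e to train one GPT-3-class model (the estimate published by Patterson et al. in 2021).

```python
# Rough comparison: annual per-person CO2 emissions vs. the one-time
# emissions of training a large language model. Both constants are
# widely cited published estimates, used here only for illustration.

AMERICAN_TONS_PER_YEAR = 16    # approx. U.S. per-capita CO2, metric tons/year
LLM_TRAINING_TONS = 552        # approx. CO2e to train a GPT-3-class model

# Express one training run in "American-years" of emissions
american_years = LLM_TRAINING_TONS / AMERICAN_TONS_PER_YEAR
print(f"One training run ≈ {american_years:.1f} American-years of CO2")
```

Under those assumptions, a single training run works out to the annual footprint of roughly three dozen average Americans – a one-time cost, which is why the ongoing growth in inference and retraining matters more than any single model.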
While individual human activity emits a lot of CO2, each LLM in turn creates a lot of its own – and many businesses and government agencies plan to build their own models, or have them built for them. Each one spews out tons of greenhouse gases. And as forecast use of generative AI grows, there will be more CO2 to crank up the global thermostat.
Let’s not forget the other abbreviations. ICT (information and communication technology) in its various forms could account for more than 20% of global energy use by 2025 – the same ground-zero deadline for turning the corner on carbon emissions. Machine translation, starting with its statistical variants two decades ago, has been responsible for a growing share of that usage under the banner of ICT. Its latest instance, neural MT, is on the scale of GenAI in terms of energy consumption, use of cooling water (H2O), and CO2 output.
Independent data analysts tell us that these technologies consume a lot of energy and excrete a lot of carbon dioxide. What sayeth The Source? I asked ChatGPT-4 about its CO2 emissions. It responded that, “The carbon footprint of an individual inference is likely relatively low. This is because once a model is trained, the energy costs to use the model ([that is,] run an inference) are substantially lower than the energy costs to train it.” As for the large language model itself, that’s a different story: “Training models like GPT-3 or GPT-4 are energy-intensive due to the sheer scale of computations involved. The carbon footprint largely depends on the source of the energy used for training. If a data center uses renewable energy, the carbon footprint will be lower than one that relies on fossil fuels.”
And therein lies part of the solution. In our research on the ethics of generative AI, we observed that organizations have adopted frameworks evaluating their actions and investments from the environmental, social, and governance (ESG) perspectives.
It’s time for LLM consumers to lean on their development teams and their technology and service suppliers to assert a basic ESG tenet: “don’t wreck your planet.” That means both buyers and suppliers of ICT, MT, AI, or whatever should live by the dictum of “one egg, one basket” – we don’t have the luxury of breaking multiple eggs.
Chief Research Officer
Focuses on market trends, business models, and business strategy