A recent examination of how computing power has changed over time calculated how much it would cost today to rent all of the computing capacity that existed at various points in the past. Until 1968, all the world's computers combined fell short of what a single server could do in 2018. It was not until 1974 that the then-current capacity (roughly 0.43 teraflops) would cost more than £100 per month to lease at today's prices. More recently, Microsoft announced that the Xbox Series X, a home gaming console, would deliver 12 teraflops – almost on par with total world computing power in 1981 – for US$499 (compared to roughly US$6,000 for a Mac Pro with equivalent computing power). This massive democratization of computing power, with the real cost of computing cycles falling exponentially over time, is arguably the biggest technology story of the twenty-first century. But what does it mean for language services?

A Brief History of TMS

Let’s look back a few years to answer this.

  • 2000: CAT tools go mainstream. In the pre-cloud days of the internet, CAT tools were just beginning to find widespread acceptance. This technological leap arguably marks the point when translation went from a manual, paper-based task in which computers played a minor role to one in which they became central – but the tools were still primitive. Many CAT tools required dongles – bits of hardware that plugged into ports on the computer to prove that you owned the software – and connecting with other language professionals involved a lot of emails and balky FTP sites. Very basic TMSes also first appeared in the late 1990s, but it took another decade for them to come into common usage.
     
  • 2010: TMSes take off. Although translation management systems had been around for more than a decade, it was in about 2010 that they became a mainstream staple for enterprise language services and LSPs. These early TMSes were complex applications installed on on-premises servers or networked desktop machines. Because they were tied to local installations, a lot of data still dead-ended on desktops, servers, and hard drives, where it remained unleveraged for purposes other than translation reuse on specific projects.
     
  • 2015: TMS moves to the cloud. Cloud-based TMS software arrived with Smartling, Memsource, WordBee, XTM Cloud, SmartCAT, and more recently, memoQCloud and SDL’s evolving Language Cloud. Since then, the state of the art in TMS has been cloud-based software that provides fully online, networked translator environments plus a host of additional functionality. The shift to the cloud finally meant that organizations could systematically and reliably centralize the rich data flows they controlled but had previously failed to capture because files sat in discrete silos across multiple desktops and servers.
     
  • 2020: The year of TMS+. The COVID-19 pandemic accelerated the shift to the cloud. Organizations that had used outdated technology – or treated new technology as if it were old – suddenly found themselves forced to invest in order to support a newly decentralized workforce logging in from home offices, bedrooms, and dining room tables due to social distancing requirements. Older desktop, on-premises, and server architectures could no longer handle these requirements, but cloud-based TMSes could. At the same time, microservices-based architectures that had been under development for several years started coming online, allowing rapid deployment of feature sets that go beyond merely managing files and processes. Today’s TMS has become the hub for managing data flows, providing access to MT engines, disparate terminology resources and repositories, AI and machine learning automation, and increasingly sophisticated content technologies – such as those shown in the figure below – that a few years ago were isolated silos.

[Figure: content technologies integrated into the modern TMS]

A Lot of Work Remains

This ongoing shift is far from done, but the advantages of cloud-based TMSes are already numerous: reducing waste in the process; simplifying version management, since there are no more worries about whether files are the latest version; replacing work mailed in from various locations with centralized output and data assets; and streamlining tasks and workflows. This movement to the cloud is critical as enterprise projects grow ever bigger. Many organizations localize tens of millions of words per month – for example, one developer of storage systems recently reported to CSA Research that its operating system has almost one billion lines of code, any of which might have implications for localization. This scale could not be managed by any number of unaided humans, especially if they were using disconnected systems. It requires the distributed, simultaneous capability of thousands of workers within a TMS augmented by automation.

And we are just at the beginning. As we discussed in a recent blog post, language services companies have tremendous unexploited resources in their translation memories and other language assets for developing small AI projects. Although we have argued that the TMS needs to simplify itself in many ways, it will become more critical for everything that language groups and companies do. TMSes will become the operating system for language companies and enterprise localization groups. TMS developers are hard at work extending their systems with new features that will increasingly integrate the functions of CAT tools, advanced NLP, machine translation, and automated content enrichment into a paradigm CSA Research has called “augmented translation.”

However, the fulfillment of this vision will depend on the ability of technology developers to agree to common specifications and frameworks for translation data (à la TAPICC), something that has been a perennial, if often quixotic, quest for localization nerds. It will also mean tackling the myriad sources of interoperability problems, ones that go beyond simple data formats, to reduce waste throughout the entire international content value chain.

All these changes mean that organizations still using ancient TMSes will find themselves compelled to update their processes and tools to take advantage of what the modern TMS has to offer – and the simplicity that comes with cloud deployment. Moving away from locked-down on-premises systems will be key to unleashing the next revolution in the language industry.

About the Author

Arle Lommel

Senior Analyst

Focuses on language technology, artificial intelligence, translation quality, and overall economic factors impacting globalization.
