Getting Derailed: Why Standards Initiatives Fall Short

February 27, 2019 | Arle Lommel | Standards | For LSPs, For Buyers, For Technology Vendors

The Holy Grail of the language industry has been to standardize the transfer of jobs between the various tools and content management systems – and thus improve the outcomes. Linport, the latest initiative in this area, was born as the Container Project in 2011 at the final meeting of the Localization Industry Standards Association (LISA). Despite early promise, Linport has yet to make major inroads into the language industry. Other prospective standards, such as Translation Web Services from OASIS back in 2002, failed to gain traction despite significant effort and solid technical outcomes. Meanwhile, some initiatives that actually became standards, such as Translation Memory eXchange (TMX), have languished in early versions for years, despite efforts to update and improve them. So why do these efforts fall short of their promise, and how can the language industry improve upon them?

To address the question of how to improve translation outcomes, we first have to identify the reasons why initiatives struggle to gain traction or even reach completion. Every industry specification faces one or more of these six problems as it wends its way through conference discussions and standards bodies:

  • Complexity. Many initiatives set out to solve a straightforward problem, such as how to exchange translation memory data or evaluate quality. As with many development projects, these efforts often grow in scope as various parties seek to address their specific edge cases and adapt the resulting standard to their business needs rather than adjust their operations to reflect the resulting guidelines. This added complexity delays delivery of the final standard and makes it more difficult for implementers to follow. In other cases, the results include support for individual use cases that nobody else follows. This problem has hampered uptake of the TermBase eXchange (TBX) standard, although the narrower TBX Basic specification from Terminorgs has seen greater adoption, and the forthcoming release of a new version of TBX aims to simplify implementation.
     
  • Competing agendas. Sometimes standards end up crashing into competing agendas. For example, the industry wanted lossless exchange of formatting tags in TMX, but various TM tools implemented this function in two fundamentally different ways that made interchange impossible. Any approach the standard took would have picked winners and losers, and in the end the tension prevented the release of TMX 2.0. Similar problems delayed Linport's development by four years, and it has yet to regain its former momentum.
     
  • Academic focus. Standards bodies such as the International Organization for Standardization (ISO) frequently have a substantial cadre of academic researchers involved in their committees. Although these individuals bring considerable expertise and knowledge about the latest developments to the table, their input needs to be balanced against the needs of implementers.
     
  • Lack of buyer involvement. Buyers of translation services typically see interoperability problems as issues for LSPs and technology developers. The W3C, OASIS, and ASTM F43 have done well in encouraging buyer involvement for specific standards, but most efforts lack real participation from buyers. Other parties interpret this absence as a lack of concern and demand, which in turn lowers their own involvement and commitment. On the other hand, when buyers are involved and take a leading role, other parties respond accordingly.
     
  • Personality-driven outcomes. Effective standards development requires a delicate balance between stakeholders. However, in some committees, individuals take over and push their own agendas, which may not reflect the needs of the broader group. When such individuals hold positions of leadership, other participants withdraw their support, and industry adopters come to see the standards that emerge as irrelevant or even wrong.
     
  • Forking. In development circles, “forking” refers to cases where an effort splits in two, with contributors taking each branch in a different direction. Taking the concept loosely, XLIFF 1.x was a well-defined standard, but individual developers forked their implementations in ways that kept it from being truly interoperable – the result was that XLIFF from one CAT tool might or might not work in another. Even if one implementation fork made changes for a good reason, the net effect was that the standard was no longer a standard in practical terms.

What’s the solution? Standardization efforts need to be well-defined and avoid scope creep. They need broad buy-in from relevant stakeholders with balanced representation of the various groups. It isn’t enough for major players to wait on the sidelines under the theory that they will benefit from the results even if they don’t participate. Crucially, buyers of language services have to commit to implement the results and require their suppliers to do the same. Unless buyers hold LSPs and technology vendors to account for their support or lack thereof, the status quo will not change.

Can we overcome these problems? Leaders of major LSPs attending CSA Research’s recent CEO Leadership Council in Paris raised this topic as a major concern: The industry desperately needs standards with support from enterprises, technology developers, and LSPs. The last twenty years have seen both good and bad examples of standards. The key will be to focus on what made the best ones work in order to deliver concrete and achievable outcomes – and for all stakeholders to realize that the language industry cannot afford to wait for someone else to save the day. If you have thoughts you would like to share about standards and how the industry should move forward, please send them to me at alommel@csa-research.com.

Image: © 2019, Dale Schultz (https://cabin-layout.mixmox.com). Used with permission.

About the Author

Arle Lommel

Senior Analyst

Focuses on language technology, artificial intelligence, translation quality, and overall economic factors impacting globalization

