
Wednesday, August 18, 2010

The Problem with Standards in the Localization Industry

In my continuing conversation with Renato Beninatto, we recently talked about standards:


It is clear from this conversation that the word “standards” is a source of great confusion in the professional translation world. Part of the problem is conflation, and part of the problem is the lack of a clear definition of what is meant by standards, especially as they relate to quality. In the conversation, we both appear to agree that data interchange is becoming much more critical and that robust data interchange standards would be valuable to the industry; however, we both feel that overall process standards like EN15038 have very little value in practical terms.


This discussion of quality standards in particular is often difficult because of conflation, i.e. very different concepts being equated and assumed to be the same. I think at least three different concepts are being referenced, and confused with one another, in many discussions on “quality”:
  1. End to End Process Standards: ISO 9001, EN15038, Microsoft QA and LISA QA 3.1. These have a strong focus on administrative, documentation, review and revision processes, not just on the linguistic quality assessment of the final translation.
  2. Automated SMT System Output Translation Quality Metrics (TQM): BLEU, METEOR, TERp, F-Measure, ROUGE and several others that focus on rapidly scoring MT output by assessing precision and recall against one or more human translations of the exact same source material. (Useful for MT system developers but not much else; a toy sketch of the basic idea follows this list.)
  3. Human Evaluation of Translation Linguistic Quality: Error categorization and subjective human quality assessment, usually at a sentence level. SAE J2450, the LISA Quality Metric and perhaps the Butler Hill TQ Metric (which Microsoft uses extensively and TAUS advocates) are examples of this. (Results can vary greatly depending on the humans involved.)
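To make the second category a little more concrete, here is a toy Python sketch of the core idea behind metrics like BLEU: count how many of the n-grams in the MT output also appear in one or more human reference translations of the same source. This is an illustration only, not any of the official implementations, and the example sentences are invented; real metrics add brevity penalties, smoothing, multiple n-gram orders and so on.

```python
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams in a list of tokens."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def modified_precision(candidate, references, n):
    """Clipped n-gram precision, the core building block of BLEU-style scores.

    Each candidate n-gram is credited at most as many times as it appears
    in the most generous reference translation.
    """
    cand_counts = ngrams(candidate, n)
    if not cand_counts:
        return 0.0
    max_ref_counts = Counter()
    for ref in references:
        for gram, count in ngrams(ref, n).items():
            max_ref_counts[gram] = max(max_ref_counts[gram], count)
    clipped = sum(min(count, max_ref_counts[gram])
                  for gram, count in cand_counts.items())
    return clipped / sum(cand_counts.values())

# Invented example: one MT output scored against one human reference.
mt_output = "the cat sat in the mat".split()
references = ["the cat sat on the mat".split()]

for n in (1, 2):
    print(f"{n}-gram precision: {modified_precision(mt_output, references, n):.2f}")
```

The point to notice is that the score is only as good as the reference translations it is compared against, which is exactly why these metrics are useful to MT developers but say very little about absolute quality.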
To this hot mess you could also add the “container” standards discussions, to further obfuscate matters. These include TMX, TBX, SRX, GMX-V, xml:tm etc. Are any of these standards, even by the much looser definition of “standard” used in the software world? If you look at the discussions on quality and standards in translation around the web, you can see that real dialog is difficult and clarity on this issue is virtually impossible.
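For readers who have never looked inside one of these “containers”, here is a minimal sketch of what a TMX translation memory file boils down to: a list of translation units, each carrying the same segment in two or more languages. The snippet below builds a tiny two-segment file with Python's standard library; the segments, languages and file name are invented, and a real TMX file carries a fuller header and much more metadata.

```python
import xml.etree.ElementTree as ET

def build_tmx(pairs, src_lang="en-US", tgt_lang="fr-FR"):
    """Build a bare-bones TMX 1.4 document from (source, target) segment pairs."""
    tmx = ET.Element("tmx", version="1.4")
    ET.SubElement(tmx, "header", {
        "srclang": src_lang, "adminlang": "en-US", "datatype": "plaintext",
        "segtype": "sentence", "o-tmf": "none",
        "creationtool": "example", "creationtoolversion": "0.1",
    })
    body = ET.SubElement(tmx, "body")
    for source, target in pairs:
        tu = ET.SubElement(body, "tu")  # one translation unit per segment pair
        for lang, text in ((src_lang, source), (tgt_lang, target)):
            tuv = ET.SubElement(tu, "tuv", {"xml:lang": lang})
            ET.SubElement(tuv, "seg").text = text
    return ET.ElementTree(tmx)

# Invented example segments.
memory = [
    ("Click Save to keep your changes.",
     "Cliquez sur Enregistrer pour conserver vos modifications."),
    ("The file could not be found.",
     "Le fichier est introuvable."),
]
build_tmx(memory).write("memory.tmx", encoding="utf-8", xml_declaration=True)
```

The whole value of a container standard is that the next tool in the chain can read this file without knowing anything about the tool that wrote it; the arguments start when vendors extend or interpret the format differently.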


But standards are needed to scale: to handle the volume of translation that will likely be done, and to enable greater inter-process automation, as we head into a world where we continuously translate dynamic streams of content. Free online MT services have given the global enterprise a taste of what translation as a utility looks like. Now some want to see if it can be done better and in a more focused way, at higher quality levels, to enhance global business initiatives and expand the dialog with the global customer. (I think it can be done much better with customized, purpose-driven MT working with, and steered by, skilled language professionals.) Translation as a utility describes an always-on, on-demand, streaming translation service that can translate high-value streams of content at defined quality levels for reasonable rates. Data will flow in and out of authoring, content management, social networks, translation workflow, MT and TM systems.


As this new mode of production gains momentum, I believe it would be useful to the industry in general to have a meaningful and widely used measure of relative translation quality, i.e. the average linguistic quality of a target corpus. This would facilitate production processes for 10X and 100X increases in content volume, and allow LSPs to define and deliver different levels of quality using different production models. I am convinced that the best translation production systems will be man-machine collaborations, as we already know what free online raw MT looks like. (Sometimes useful for getting the gist of a text, but rarely useful for enterprise use.) Skilled humans who understand translation automation tools and know how to drive and steer linguistic quality in these new production models can dramatically change this reality.


It would also be useful to have robust data interchange standards. I recently wrote an entry about the lack of robust data interchange standards that seemed to resonate. We are seeing that content on the internet is evolving from an HTML to an XML perspective, which makes it easier for content to flow in and out of key business processes. Some are suggesting that soon all the data will live in the cloud and that applications will decline in importance, letting translators zero in on what they do best and only what they do best: translate. Today, too much of their time is spent on making the data usable.


There are some data standard initiatives that could build momentum, e.g. XLIFF 2.0, but these initiatives will need more volunteer involvement (as “Anonymous” reminded me, that includes people like me actually walking the walk, not just talking about it) and broad community support and engagement. The problem is that there is no one place to go for standards. LISA? OASIS? W3C? ISO TC37? How do we get these separate efforts to collaborate and produce single, unified specifications that have authority and MUST be adhered to? There are others who have lost faith in the industry associations and expect that the standards will most likely come from outside the industry, perhaps inadvertently from companies like Google and Facebook implementing an open XML-based data interchange format. Or possibly this could come from one of the open translation initiatives that seem to be growing in strength across the globe.
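As a small illustration of why a single, widely implemented interchange format matters, here is a sketch of how little code it takes to pull bilingual segments out of an XLIFF 1.2 style file once everyone agrees to write the same XML (1.2 is used here because 2.0 is still being drafted). The namespace is the one defined in the OASIS 1.2 specification; the file name is invented and the code is illustrative, not production-ready.

```python
import xml.etree.ElementTree as ET

# XLIFF 1.2 namespace as defined by the OASIS specification.
NS = {"x": "urn:oasis:names:tc:xliff:document:1.2"}

def read_xliff(path):
    """Yield (unit id, source text, target text) from an XLIFF 1.2 file."""
    root = ET.parse(path).getroot()
    for unit in root.iterfind(".//x:trans-unit", NS):
        source = unit.find("x:source", NS)
        target = unit.find("x:target", NS)
        yield (
            unit.get("id"),
            source.text if source is not None else "",
            target.text if target is not None else "",
        )

# Hypothetical hand-off file name, for illustration only.
for unit_id, source, target in read_xliff("handoff.xlf"):
    print(f"{unit_id}: {source!r} -> {target!r}")
```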


There are at least two standards which, if well defined and widely used, I think would really help make translation as a utility happen:
  1. A linguistic quality rating that is at least somewhat objective, can be easily reproduced/replicated, and can be used to indicate the relative linguistic quality of both human translations and the output of various MT systems. This would be especially useful to LSPs in understanding post-editing cost structures and would help establish more effective pricing models for this kind of work that are fair to both customers and translators. (A toy scoring sketch follows this list.)
  2. A robust, flexible yet simple data interchange standard that protects linguistic assets (TM, terminology, glossary) but can also easily be exported to affiliated processes (CMS, DMS, Web Content). 
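To make the first item a bit more tangible, here is a toy sketch of what an error-category score in the spirit of SAE J2450 or the LISA QA model looks like: reviewers mark errors by category and severity, and the score is the weighted error count normalized per 100 source words. The categories, weights and sample data below are invented for illustration; they are not the values from any published standard.

```python
# Toy weighted error-category score, loosely in the spirit of SAE J2450 /
# LISA QA style review. Categories, weights and sample data are invented.
WEIGHTS = {
    ("terminology", "minor"): 2, ("terminology", "serious"): 5,
    ("accuracy", "minor"): 2, ("accuracy", "serious"): 5,
    ("grammar", "minor"): 1, ("grammar", "serious"): 3,
    ("style", "minor"): 1, ("style", "serious"): 2,
}

def quality_score(errors, word_count):
    """Weighted errors per 100 source words (lower is better)."""
    penalty = sum(WEIGHTS[(category, severity)] for category, severity in errors)
    return 100.0 * penalty / word_count

# A reviewer's (category, severity) marks for a 1,250-word sample.
marks = [("terminology", "serious"), ("grammar", "minor"), ("style", "minor")]
print(f"Score: {quality_score(marks, 1250):.2f} weighted errors per 100 words")
```

Reproducibility then hinges on how consistently different reviewers apply the categories, which is exactly the weakness a more objective, industry-wide rating would have to address.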
Anyway, it is clear to me that we do need standards, and that this will likely require open innovation and collaboration on a scale and in ways that we have not seen yet. This can start simply, though, with a discussion right here, one that could guide or provide valuable feedback to the existing initiatives and official bodies. We need to both talk the talk and walk the walk. My intent here is to raise the level of awareness of this issue, though I am also aware that I do not have the answers, and I am focused on a problem that few see as a real one yet. I invite any readers who may wish to contribute to this forum to further this discussion, even if you choose to do so anonymously or possibly as a guest post. (No censorship allowed here.)


Perhaps we should take heed of what John Anster and W. H. Murray (not Goethe) said as we move forward:
Until one is committed, there is hesitancy, the chance to draw back, always ineffectiveness, concerning all acts of initiative and creation. There is one elementary truth, the ignorance of which kills countless ideas and splendid plans: that the moment one definitely commits oneself, then Providence moves too. All sorts of things occur to help one that would never have otherwise occurred.

6 comments:

  1. Thanks, Kirti, for expanding on our conversation with so much insight.

    This week I came across a comment on a blog post about Transperfect that I think perfectly illustrates my concern about the unnecessary standardization of process. When I see that someone is now ISO certified, I get scared, not happy. I have known ISO since 1994 and I am still not impressed.

    Here is the comment:

    "As someone who used to work in TransPerfect's London office, I had a ringside seat in watching the company mutate from its former position as a hungry, dynamic multinational to what it has now become – a chaotic and bloated tantrum of a company that drools uncontrollably at the slightest prospect of ripping off its service providers.

    Back in 2002-2003 TPT had a fairly liberal internal attitude with regard to how its internal linguists structured their working day – you could use whichever methods you wanted to when proofreading a file, provided the work got done on time. This was actually crucial in allowing me to develop the skills I still use today as a freelance translator. They then decided that in order to become more efficient (read: in order to charge their clients more and pay their service providers less) they would obtain ISO 9001:2008 certification. This was the effective death-knell of internal and external quality, since linguists were forced to spend all their time filling out endless useless checklists and forms rather than actually translating documents."

    The original post is on the Segno di Caino blog (http://segnodicaino.blogspot.com/2010/07/we-get-letters-transperfect-is-still.html)

  2. Great article, Kirti.

    LISA's GMX-Q might be a candidate for linguistic quality rating:
    http://www.lisa.org/Global-information-m.104.0.html

  3. Renato

    Your reference to the comments about the dysfunction and collapse at Transperfect, possibly caused by the attempted standardization of essentially creative work, makes a lot of sense from the perspective of studies that explore what motivates knowledge workers.

    Process standards like ISO and EN15038 can quite possibly have a negative impact on productivity and job satisfaction for knowledge workers, or indeed for any kind of creative work.

    The following video, based on motivation research by Daniel Pink, points this out very clearly. While process and structure are of course important, the process standards mentioned above destroy creativity and mechanize what are essentially creative processes.

    http://www.youtube.com/watch?v=u6XAPnuFjJc&feature=player_embedded

  4. In the good old days, when ISO and EN15038 still had a long way to go before destroying our creativity and mechanizing our precious cognitive processes, transcreators were free to elaborate on originals and to deliver to their clients or audience what they possibly expected from them:

    http://www.goethesociety.org/pages/quotescom.html

    I would not put ISO and EN15038 on the same footing, as the former is just the adaptation of a general quality assurance standard to the language industry, while the latter was basically developed by the translation industry itself in an attempt (if I see it correctly) to regulate itself and give clients some clue for choosing between true language businesses and simple intermediaries.

    EN15038 belongs to a business context which you and others say is quickly disappearing due to technological change, so there is no need to dismiss this feature of a dying model as a useless burden on the industry. If the industry is evolving, it will generate new opportunities and new abuses, which in turn will probably create the need for new "rules", "standards" or "habits".

    In this period of transition I would think of ways to motivate and reward good linguists to stay in the industry, rather than play down other people's attempts to do their best.

    A new technology or opportunity which is really good will be easily taken on board by everyone slightly above the minimum IQ, provided there is a fair share of advantage for everyone.

  5. Hi Kirti,

    I agree that there is some confusion regarding Localization Industry Standards; it has often been mentioned in the past. That is why we created the OASIS OAXAL recommendation: Open Architecture for XML Authoring and Localization. You can see the online TC recommendation in Wiki form: http://wiki.oasis-open.org/oaxal/FrontPage, or in PDF form: http://www.l24.cm/OAXAL - fully joined-up Localization.

    As all data formats can be converted to/from XML without any loss, OAXAL uses XML as its main data format. In any case XML has now become the predominant format as you also mention.

    OAXAL provides exactly the type of SOA architecture that you describe. No need to look any further, then: it has all been laid out for you, and yes, it does work. I have personally implemented two enterprise-scale systems with it. As OAXAL is extensible, the LISA GMX/Q standard can easily be added when it is ready. It fully supports and encourages automation and makes integration with SMT systems very simple.

    Having an Open Standards based Open Architecture prevents vendor lock-in and gives direct access to data within a Localization workflow. Not only that, it works much better than proprietary systems.

    If you require any further explanation of how to implement an OAXAL-based system, please do not hesitate to contact me.

    Best Regards,

    AZ
    Posted by Andrzej Zydroń

  6. Some interesting updates to this at

    http://www.tausdata.org/blog/2011/01/the-dark-side-of-standards/#comment-561
