
The Translation Industry in 2022

In this report, TAUS shares predictions for the future of the translation industry, in line with their expectation that automation will accelerate in the translation sector over the coming five years. The anticipated changes will inevitably bring various challenges and opportunities, all of which are explained thoroughly in the Translation Industry in 2022 report.

The report explains the following six drivers of change:

1. Machine Learning

Machine learning (ML) was introduced in the 1950s as a subset of artificial intelligence (AI): programs that feed on data, recognize patterns in it, and draw inferences from them. 2016 was the year ML went mainstream, with many applications that were almost unimaginable a few years earlier; image recognition and self-driving cars are just two examples. Computational power and unprecedented advances in deep neural networks will make data-driven technologies astonishingly disruptive. This may also prove to be the case for MT.

As a rule, the growth of machine intelligence represents a threat to many human jobs, as people will be replaced by intelligent systems. The majority of creative jobs are relatively safe, while sales jobs could be at risk. The forecast is uncertain for technology jobs: more senior roles are relatively secure, while computer programmers and support workers may well be replaced. The assumption that jobs requiring manual dexterity, creativity, and social skills are the hardest to computerize is already obsolete: new developments in deep learning are making machines more capable than anticipated, especially in areas relating to creativity and social interaction.

In the translation industry, as in other industries, many functions will be affected by ML, whether enhanced, expanded, or replaced.

2. Machine Translation

In recent years, NMT has been reported to achieve impressive results, and it is more and more often presented as a replacement for SMT. Advances in artificial neural networks are raising extremely high expectations, suggesting that NMT could rapidly achieve higher accuracy than SMT. Independent evaluators find that NMT translations are more fluent and more accurate in terms of word order than those produced by phrase-based systems. Better-quality MT will mean that a broader range of document types and audiences can be addressed.
NMT will also help the further expansion of speech-to-speech (S2S) technologies, now available mostly as English-based monolingual systems. Transforming these into multilingual systems implies many deep and expensive changes. Most S2S technologies are still in their infancy and confined to university labs. NMT will help bring speech-enabled devices to the streets.

MT will lead to the ultimate disruption in the translation industry when only the premium segment of artistic (and possibly life sciences) translation remains tradable.

3. Quality Management

Due to the uncertainties intrinsic to translation quality assessment, and the fixity of the relevant concepts in the translation community, users now seem willing to accept good-enough MT output, especially for large volumes delivered virtually in real time. For serviceable MT output with no human intervention downstream, TAUS coined the acronym FAUT (Fully Automated Useful Translation) as early as 2007. Investing in quality-related decision-support tools has become essential to gain insight into translation projects and benefit from MT.
Applying machine learning to data-driven translation quality assessment will be a disruptive innovation, calling for a major shift in conception and attitude. Data-driven applications in translation quality assessment will move from document classifiers to style scorers, from comparison tools to automatic and predictive quality assessment, and from content sampling to automatic error detection and identification.
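
The automatic error detection mentioned above can be sketched in a few lines. The features (length ratio, copied-token ratio) and the thresholds below are illustrative assumptions, not an established quality-estimation metric; real systems learn such signals from annotated data.

```python
# A minimal sketch of automatic quality estimation for MT output.
# Features and thresholds are illustrative assumptions only.

def quality_features(source: str, translation: str) -> dict:
    """Compute simple surface features often used in MT quality estimation."""
    src_tokens = source.split()
    tgt_tokens = translation.split()
    length_ratio = len(tgt_tokens) / max(len(src_tokens), 1)
    # Fraction of output tokens copied verbatim from the source:
    # a rough signal of untranslated material.
    copied = sum(1 for t in tgt_tokens if t in src_tokens)
    copy_ratio = copied / max(len(tgt_tokens), 1)
    return {"length_ratio": length_ratio, "copy_ratio": copy_ratio}

def flag_for_review(source: str, translation: str) -> bool:
    """Flag segments whose features fall outside illustrative safe bands."""
    f = quality_features(source, translation)
    return not (0.6 <= f["length_ratio"] <= 1.6) or f["copy_ratio"] > 0.5
```

A predictive system would replace the hand-set bands with a model trained on past human quality judgments, which is precisely the attitude shift the report describes.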

4. Data

Data has been the fuel of automation, and after entering the automation era at full speed, we face many issues. Translation data is typically metadata: data about translation that can be harvested after the closure of a translation project, job, or task. Analyzing translation data can provide valuable insight into translation processes: to find the best resource for a job, to decide what to translate, and to choose which technology to use for which content. Translation data will increasingly be generated by algorithms, with more data coming from staff ratings and KPIs. All these kinds of data will come from ML applied to translation management platforms, which will progressively remove human involvement.
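
Finding the best resource for a job from harvested metadata can be sketched as a simple aggregation. The record fields here (translator, domain, quality score, words per day) are hypothetical, not an established schema.

```python
# A hedged sketch of mining translation project metadata to rank resources.
# The records and their fields are invented for illustration.

from collections import defaultdict

# Hypothetical metadata harvested after project closure.
records = [
    {"translator": "A", "domain": "legal", "quality": 92, "wpd": 2400},
    {"translator": "A", "domain": "games", "quality": 78, "wpd": 3100},
    {"translator": "B", "domain": "legal", "quality": 88, "wpd": 2800},
    {"translator": "B", "domain": "games", "quality": 95, "wpd": 2600},
]

def best_resource(domain: str, min_wpd: int = 0) -> str:
    """Rank translators for a domain by average historical quality score."""
    scores = defaultdict(list)
    for r in records:
        if r["domain"] == domain and r["wpd"] >= min_wpd:
            scores[r["translator"]].append(r["quality"])
    ranked = sorted(scores, key=lambda t: sum(scores[t]) / len(scores[t]),
                    reverse=True)
    return ranked[0]
```

At scale, the same aggregation over shared industry metadata is what would turn translation data into the "real big data" the report calls for.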

Data covering multilingual text resources is also, erroneously, labeled as translation data. In fact, language data specifically consists of translation memories, corpora, and lexicographical and terminological collections. Of course, all these resources have metadata too, which could be exploited. Stakeholders should become more open and start massively sharing their translation metadata to make it the real big data of the translation industry.

There is a strong need for data scientists, specialists, and analysts, but this profile is still absent from the translation industry. Translation companies should be looking out for these specialists, who can mine and use data for automation. This will most probably lead to a further reduction in the number of translation companies able to float and thrive in an increasingly competitive market. The challenge for the next few years might be the standardization of translation data, shaping it so that users can derive maximum benefit from it.

5. Interoperability

Interoperability is the ability of two different systems to communicate and work together through a common language or interface. While many other industries have flourished thanks to standardization, which led to interoperability, automation, and innovation, the translation industry has always suffered from a lack of interoperability. This has been costing a fortune for years, both on the client side (in translation budgets) and on the vendor side (in revenues).

Things have been changing since 2011, when TAUS published a report on the costs of the lack of interoperability in the translation industry. Many blame non-compliance with interchange format standards as the primary barrier to interoperability, and no one believes any longer that true interoperability in the translation industry can be achieved through awareness programs, education, and certifications alone. Interoperability should come from the adoption of standards created by consortia, not from the dominance of a market leader.

The spread of MT has forced a breakthrough in the interoperability dilemma, starting a wave of innovation and renewed effort. Most of these efforts still focus on APIs, though: XML has been established for years as the common language, and most of the industry has found its child formats TMX and XLIFF essentially sufficient. So far, most of the many available APIs are meant to simplify the translation business process and reduce translation management overhead. Only a few have been designed to support disintermediation and facilitate access to services.
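
The interchange formats named above are plain XML, which is why they can be read with nothing but a standard library. A TMX document stores each translation unit as a `<tu>` element with per-language `<tuv>`/`<seg>` children; the sample document below is a minimal illustration, not real project data.

```python
# Minimal TMX reader using only the Python standard library.
# The sample document is invented for illustration.

import xml.etree.ElementTree as ET

# The xml:lang attribute lives in the reserved XML namespace.
XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

sample_tmx = """<?xml version="1.0"?>
<tmx version="1.4">
  <header srclang="en" datatype="plaintext"/>
  <body>
    <tu>
      <tuv xml:lang="en"><seg>Start the engine.</seg></tuv>
      <tuv xml:lang="de"><seg>Starten Sie den Motor.</seg></tuv>
    </tu>
  </body>
</tmx>"""

def read_tmx(text: str) -> list:
    """Return each translation unit as a {language: segment} mapping."""
    root = ET.fromstring(text)
    units = []
    for tu in root.iter("tu"):
        units.append({tuv.get(XML_LANG): tuv.findtext("seg")
                      for tuv in tu.iter("tuv")})
    return units
```

Because any tool that emits well-formed TMX can be read this way, the format itself is rarely the interoperability bottleneck; the surrounding APIs and workflows are.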

Here we can expect the most influential buyers of localization and translation services to advance their requests; technology vendors with the necessary technological and financial resources will fulfill those requests or even introduce their own solutions to the market, just as has happened in the past.

6. Academy

Translation education is vocational by definition: it prepares people to work in the trade as translators. The skills translation students acquire are sophisticated ones. Yet today, many players in the translation industry complain about the lack of good translators, while seeming to ignore that, more than in many other academic fields, translation education follows obsolete models still shaped for the 20th century. To make matters worse, the gap between the academic world and the industry is so wide that, on approaching the job market, translation graduates instantly and bitterly realize they don't know much about the actual work they are supposed to do. They also discover that the world is not interested in their basic skills.

The future may not really need translators, at least not in the old way, as audiences become ever more forgiving of lower quality in fast-moving content. A highly automated localization environment will depend on human skills in quality evaluation, content profiling, cultural advisory, data analysis, and computational linguistics, and gradually less and less on post-editing; translating plain text will indeed be a long-tail business.

The success of any innovation depends entirely on the people who are going to nurture, develop, and implement it; in times of exponential growth, education is vital to drive adoption and prepare the next generations of workers. Employers should play a part in closing the skills gap with continued professional training. It is never too early to prepare for the future; vast workforce and organizational changes are necessary to upend stale business models and related processes.

For more details, download the full report.

‘Human Parity Achieved’ in MT

According to Microsoft’s March 14, 2018 research paper with the full title of “Achieving Human Parity on Automatic Chinese to English News Translation,” a few variations of a new NMT system they developed have achieved “human parity,” i.e. they were considered equal in quality to human translations (the paper defines human quality as “professional human translations on the WMT 2017 Chinese to English news task”).

Microsoft devised a new human evaluation system to arrive at this convenient conclusion, but first they had to make "human parity" less nebulous and better defined.

Microsoft’s definition for human parity in their research is thus: “If a bilingual human judges the quality of a candidate translation produced by a human to be equivalent to one produced by a machine, then the machine has achieved human parity.”

In mathematical, testable terms, human parity is achieved “if there is no statistically significant difference between human quality scores for a test set of candidate translations from a machine translation system and the scores for the corresponding human translations.”
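
That testable criterion amounts to a two-sample significance test on quality scores. The sketch below uses Welch's t statistic with made-up score lists; Microsoft's actual evaluation protocol differs in detail, and the 1.96 cutoff is the usual large-sample threshold at p = 0.05.

```python
# A sketch of the parity criterion: no statistically significant difference
# between human quality scores for machine and human translations.
# Score lists are invented illustrations.

from statistics import mean, variance

def welch_t(a: list, b: list) -> float:
    """Welch's t statistic for two independent samples."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

machine_scores = [71, 74, 69, 72, 70, 73, 75, 68]
human_scores = [72, 70, 73, 71, 69, 74, 72, 70]

# For large samples, |t| < 1.96 means no significant difference at p = 0.05,
# i.e. the parity criterion is met under this simplified reading.
t = welch_t(machine_scores, human_scores)
parity = abs(t) < 1.96
```

Note that "no significant difference" is a weaker claim than "equal quality": with small or noisy samples, the test can fail to detect a real gap, which is one reason the parity claim drew scrutiny.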

Microsoft made everything about this new research open source, citing external validation and future research as the reason.

Reference: https://goo.gl/3iFXXG

3 Tips for Preparing Video Game Localization Kit

An essential part of a video game localization project is planning. The first step in the planning process is to perform a pre-localization analysis to determine the budget and the depth of localization. For example, will the localization consist of translating only the marketing material and the player's manual, or will it be a complete localization that also translates the in-game text and modifies the game's video? Once the pre-localization analysis has been completed, it is time to develop a video game localization kit.


Video Game Localization & Cultural Adaptation

The evolution of video games since their inception in the 1970s has exploded into the Internet Age and morphed into a worldwide phenomenon. As the games, developers, and players have become more sophisticated, entire fields and professions have been created to meet the challenges of marketing and selling games around the world. A crucial step in that process is localization, the process of adapting a game to its new target audience. Localization can be as simple as translating and redesigning the packaging, or more extensive, such as changing scenes in the game and the appearance of characters to appeal to players in the new market. Cultural adaptation, or culturalization, is a more in-depth process that ensures the game is free from cultural barriers to full acceptance by gamers in the target country or culture.


5 Essentials for Successful Game Localization

Localizers have a critical role to play in the development of games. They work closely with game makers, as translation must be embedded in many aspects of the software, including character names, rules of the game, help topics, weapons, and so on. The major disadvantage of being a localizer is time and scheduling constraints, as localization needs to be incorporated while the game is being developed, not after it is completed. This situation comes with its own set of challenges. Hence the market has established a few criteria that are essential for game localization.


Game Localization History: Brief Overview

Game localization has played a key role in the incredible growth of the video game industry, allowing publishers to sell their games in every country worldwide and enabling thousands, if not millions, of people to enjoy them. It has helped make the video game industry a very lucrative business. The localization industry has evolved over the years and remains imperative to translating games for worldwide distribution.


Machine Translation Post-Editing Types

Post-editing is the next step after completing the machine translation (MT) process and evaluating its output. A human translator processes the document to verify that the source and target texts convey the same information and that the tone of the translation is consistent with the original document. The quality of machine translation varies and affects the subsequent effort required for post-editing. Among the factors contributing to MT quality are the clarity and quality of the source text; it is important to make sure beforehand that the source text is well written and well suited for machine translation. Other significant factors affecting MT output quality include the type of MT engine used and the compatibility of the source and target languages.
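
The post-editing effort the paragraph describes is commonly approximated by the edit distance between the raw MT output and the final post-edited text (the idea behind metrics such as TER). Below is a minimal word-level Levenshtein sketch; the sample strings are invented.

```python
# Word-level edit distance as a rough proxy for post-editing effort.
# Sample inputs are illustrative only.

def word_edit_distance(mt: str, post_edited: str) -> int:
    """Minimum number of word insertions, deletions, and substitutions."""
    a, b = mt.split(), post_edited.split()
    prev = list(range(len(b) + 1))
    for i, wa in enumerate(a, 1):
        curr = [i]
        for j, wb in enumerate(b, 1):
            cost = 0 if wa == wb else 1
            curr.append(min(prev[j] + 1,        # delete a word
                            curr[j - 1] + 1,    # insert a word
                            prev[j - 1] + cost))  # substitute (or match)
        prev = curr
    return prev[-1]

def pe_effort(mt: str, post_edited: str) -> float:
    """Fraction of words changed, normalized by post-edited length."""
    return word_edit_distance(mt, post_edited) / max(len(post_edited.split()), 1)
```

A low score suggests light post-editing was enough; a high score signals the MT output needed heavy rework, which feeds directly into the choice between the post-editing levels discussed below.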

There are two types, or levels, of post-editing.

