This is a paper presented by Jill Goldsberry, a recent graduate of the Global Digital Marketing and Localization Certification (GDMLC) program. This paper presents work produced by students of The Localization Institute’s Global Digital Marketing and Localization Certificate program. The contents of this paper are presented to create discussion in the global marketing industry on this topic; they are not to be considered an adopted standard of any kind. This paper does not represent the official position of the Brand2Global Conference, The Localization Institute, or the author’s organization.
On April 28, 2006, Google launched Google Translate, a statistical machine translation platform that allowed individuals to translate text from one language to another. According to Wikipedia, as of May 2013 Google Translate served over 200 million people a day and supported over 100 languages, with varying levels of accuracy, imperfections, and errors depending on the source and target language pair. In September 2016, Google announced that it was adopting neural learning as a means of correcting some of the translation errors present in its existing program (CSA, 2017). Much of the recent advancement in neural networks and artificial intelligence has been driven by the challenges of natural language processing. The focus of neural learning is not only to translate phrases and sentences but also to bring out the greater meaning of the message being conveyed. Google’s products would no longer represent the fruits of traditional computer programming exactly, but of “machine learning” (Lewis-Kraus, 2016).
Many of those who have used neural learning as a means of translation have strongly associated it with human translation, saying that it is as good as human translation. Although little research has been done to determine its effectiveness, some have suggested that, if these claims prove valid, human translation could face extinction (Lewis-Kraus, 2016). Neural machine translation represents a considerable improvement over statistical machine translation, but a large gap still exists between machine translation and human translation (Brusveen, 2017). Improved machine translation may increase its acceptability and may also offer a solution to the divide between the supply of translators and the demand for language services. At the same time, it may enhance the importance of human translators by leaving them more time to perform high-value duties.
Google is among the first to extensively apply neural machine translation in a production environment. Before 2016, neural machine translation was regarded by artificial intelligence scientists as something of the future, but that changed in 2016, when Microsoft and SYSTRAN launched neural translation, upstaging Google by several months (CSA, 2017). However, this launch received little press coverage compared to Google’s savvy press briefing a few months later. Google’s press release notably stated that neural machine translation (NMT) generated target text that was almost indistinguishable from human translation. It was noteworthy that, when grading this text, the human reviewers found that neural machine translation had reduced errors by between 55 and 85 percent (CSA, 2017).
Few can accuse the tech giant Google of offering an inaccurate translator. The translation tool, which has been in existence for almost ten years, is designed to translate between as many as 100 different languages (Brusveen, 2017). The developers of the program took several steps to ensure that it generates minimal errors, and it can also differentiate between various distinct dialects. As effective as the translation program is, it still produces errors that need to be corrected; in other words, it leaves room for improvement. Google is therefore now adopting Google Neural Machine Translation. Google’s engineers initially developed the translation program to improve single-sentence translation.
The developers have programmed the system to work like the human mind. In other words, they have developed a translation algorithm that divides an entire sentence into segments, which are then matched against a translation dictionary. The algorithm first breaks sentences into syntactical components, which are then more easily translated into other languages and dialects (Brusveen, 2017). One downside of the program concerns interpretation speed. The tech giant has had to develop layered calculations, implemented on an optimized chip, to reduce processing time. The result was a translation latency of about 300 milliseconds, and improvements are still ongoing.
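The segment-and-lookup process described above can be sketched with a toy example. The phrase table, language pair, and greedy longest-match strategy below are invented purely for illustration; they are a minimal sketch of dictionary-based segment translation, not Google’s actual algorithm.

```python
# Toy sketch of segment-and-lookup translation (illustrative only).
# The phrase table and greedy longest-match strategy are assumptions,
# not a description of any real production system.

PHRASE_TABLE = {
    ("good", "morning"): "buenos días",   # multi-word segment
    ("thank", "you"): "gracias",
    ("good",): "bueno",
    ("morning",): "mañana",
}

def translate(sentence: str) -> str:
    """Greedily match the longest known segment at each position."""
    tokens = sentence.lower().split()
    output = []
    i = 0
    while i < len(tokens):
        # Try the longest segment first, then progressively shorter ones.
        for length in range(len(tokens) - i, 0, -1):
            segment = tuple(tokens[i:i + length])
            if segment in PHRASE_TABLE:
                output.append(PHRASE_TABLE[segment])
                i += length
                break
        else:
            output.append(tokens[i])  # pass unknown words through unchanged
            i += 1
    return " ".join(output)
```

A real system would weigh many candidate segmentations against each other; this sketch only shows why segmenting before lookup matters, since “good morning” as one segment translates differently than “good” and “morning” separately.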
Remarkable advancements have been made in applying neural networks and artificial intelligence to intricate natural language processing tasks, e.g., speech recognition and machine translation. For example, Microsoft’s speech recognition method is on par with humans in recognizing spoken language (Marciano, 2016). Google has not been left behind with regard to neural machine translation, having made significant inroads toward grammatically smooth translations.
In its history spanning nearly 20 years, Google’s deliberate and uncompromising approach to machine translation was born of a need to provide efficient translations. Google has worked with machine translation since the late 1990s and has had a wide-ranging post-editing contribution since 2004. Because of this technological experience, Google is cognizant that the continuing improvement of an exceptional proprietary machine translation method requires a range of development resources, computing power, and data collection that confers a distinct advantage on enterprise technology companies such as Microsoft and Google.
In this respect, Google took a distinct approach to machine translation, concentrating its development endeavors on an accommodating API-driven infrastructure rather than simply acquiring or creating a standalone machine translation system (Lewis-Kraus, 2016). This route has enabled Google to exploit the best of machine translation technology and direct it with its linguistic know-how and its experience in tackling localization and translation challenges. In other words, Google is delighted by the current progress in neural machine translation, which will enable the company to improve service delivery to its clients.
Like any maturing industry, the localization industry is seeing notable changes. The sizes of files being translated have decreased considerably over the last five to seven years, owing to the shifting nature of the content being published, with word counts falling from averages in the thousands to averages in the tens. As a result, the cost of the transaction often outweighs the cost of the translation itself. Consequently, Language Service Providers (LSPs) like Lionbridge are gearing their efforts toward minimizing these transaction costs through proficient content localization lifecycle management solutions: automated processes that manage the whole practice of retrieving content from the client’s repository up to the point where it is returned to the client (Marciano, 2016).
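The point that transaction costs can outweigh translation costs is easiest to see with numbers. The per-word rate and fixed handling cost below are invented for illustration only:

```python
# Illustrative cost comparison with invented figures: once a job shrinks
# to a few dozen words, the fixed per-job (transaction) cost can exceed
# the per-word translation cost. Both rates are assumptions.

PER_WORD_RATE = 0.20      # assumed USD per translated word
TRANSACTION_COST = 25.00  # assumed fixed USD per job (file handling, PM, QA)

def job_cost(word_count: int) -> tuple[float, float]:
    """Return (translation cost, fixed transaction cost) for one job."""
    return word_count * PER_WORD_RATE, TRANSACTION_COST

translation, fixed = job_cost(40)
# A 40-word job generates $8.00 of translation work against $25.00
# of fixed handling cost, which is why automating the transaction side
# matters more as jobs get smaller.
```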
These systems are based on intelligent components that can break content into segments and pass each segment to the appropriate multilingual worker at the appropriate time. As the industry advances toward multilingual communication assurance, this intelligent infrastructure is of paramount importance. For content to be directed through the right channels, it must be understood by the system, assembled accordingly, and, where human involvement is required, channeled to the right worker (Brusveen, 2017). LSPs such as Lionbridge have therefore invested in cloud-based technology platforms that give them the leeway to manage and segment localization lifecycles. This has resulted in translations being generated with reduced transaction and overhead costs; as each process is automated, labor costs have been reduced (Marciano, 2016).
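The routing step described above can be sketched as a simple rule. The segment fields, confidence threshold, and queue names below are all assumptions made for illustration, not any vendor’s actual workflow:

```python
# Hypothetical sketch of automated localization routing: a segment goes
# to machine translation when the system is confident, and is queued for
# a human linguist otherwise. Field names and the 0.9 threshold are
# invented for illustration.

MT_CONFIDENCE_THRESHOLD = 0.9  # assumed cutoff for fully automatic handling

def route_segment(segment: dict) -> str:
    """Decide which queue a content segment should be sent to."""
    if segment["mt_confidence"] >= MT_CONFIDENCE_THRESHOLD:
        return "machine_translation"
    # Below the threshold, route to a human translator for the target language.
    return f"human_translator_{segment['target_lang']}"
```

In a real lifecycle-management platform, the decision would also consider content type, deadlines, and client preferences; the sketch only shows the basic idea of conditional routing between machine and human channels.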
In closing, artificial intelligence is restructuring the way not only Google but all businesses design their future, be it for fraud prevention or customer service. The backbone of progress in neural learning is to automate what humans do when they interact and to have AI take on human-level functions, whether for verification technology or voice recognition (Brusveen, 2017). Related technology practices, e.g., machine learning, add another dimension to artificial intelligence, as various practices and technologies are associated with the success and development of artificial intelligence. Machine learning is essentially the capability to program an algorithm that instructs a machine what it should search for; the machine is then given the leeway to decide the best method for processing raw data into analyzed data. As such, there is no denying that Google has seen vastly improved translations due to neural machine translation. However, luckily for those employed in the translation industry, we still have a long way to go.
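That division of labor, where the programmer chooses what to look for and the machine derives its own model from raw data, can be illustrated with a toy language guesser. Everything below, including the training sentences and the choice of character frequencies as the feature, is invented for illustration:

```python
from collections import Counter

# Toy machine-learning sketch: the programmer tells the machine WHAT to
# look for (character frequencies); the machine builds its own model
# from example data. The two training sentences are illustrative only.

TRAINING = {
    "english": "the quick brown fox jumps over the lazy dog",
    "spanish": "el rápido zorro marrón salta sobre el perro perezoso",
}

def build_profiles(training: dict) -> dict:
    """Learn a normalized character-frequency profile per language."""
    profiles = {}
    for lang, text in training.items():
        counts = Counter(c for c in text.lower() if c.isalpha())
        total = sum(counts.values())
        profiles[lang] = {c: n / total for c, n in counts.items()}
    return profiles

def guess_language(text: str, profiles: dict) -> str:
    """Score the text against each learned profile; highest total wins."""
    def score(profile: dict) -> float:
        return sum(profile.get(c, 0.0) for c in text.lower() if c.isalpha())
    return max(profiles, key=lambda lang: score(profiles[lang]))
```

The programmer never writes a rule like “Spanish uses accented vowels”; the model discovers such regularities from the raw training text, which is the distinction the paragraph above draws between traditional programming and machine learning.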
Common Sense Advisory [CSA]. (2017, January 18). Is neural MT really as good as human translation? Retrieved March 1, 2017, from http://www.commonsenseadvisory.com/default.aspx?Contenttype=ArticleDetAD&tabID=63&Aid=37896&moduleId=390
Brusveen, T. (2017, January 16). An artificial intelligence briefing. Retrieved March 1, 2017, from https://www.internetretailer.com/2017/01/16/artificial-intelligence-briefing?p=1
Lewis-Kraus, G. (2016, December 30). The great A.I. awakening. The New York Times Magazine. Retrieved from https://www.nytimes.com/2016/12/14/magazine/the-great-ai-awakening.html?_r=0
Marciano, J. (2016, December 19). Machine translation advancement: What it means for language services. Retrieved from http://content.lionbridge.com/what-machine-translation-advancement-means-language-services/
Jill Goldsberry, Business Development Director for LanguageLine Solutions, is responsible for managing a diverse client base, including Software, IT, Manufacturing, Financial, Retail, Biotech, Pharma, and Government.
Jill is passionate about understanding her clients’ organizational goals and objectives. She enjoys creating optimal multilingual communication solutions while building a custom implementation and service delivery schedule model that aligns with existing technology and processes.
Jill recently completed the Global Digital Marketing and Localization Certification (GDMLC) Program and is a distinguished graduate of Purdue University. Jill resides in San Mateo, CA and has 20 years of localization, media and marketing experience across multiple verticals.
“The single biggest problem in communication is the illusion that it has taken place.”
George Bernard Shaw
The program offers dual credentials, with a Certificate from the University of North Carolina Wilmington and a Certification from The Localization Institute.