than anticipated, and that progress had been painfully slow. It should be mentioned that just over five years earlier Yehoshua Bar-Hillel, one of the first enthusiasts for machine translation who had since become disillusioned with the field, had published his critical review of machine translation research, in which he rejected the implicit aim of fully automatic high quality translation (FAHQT). Indeed, he offered a proof of its "non-feasibility". The writers of the ALPAC report agreed with this diagnosis and recommended that research on fully automatic systems should stop and that attention be directed instead to lower-level aids for translators.
For some years after ALPAC, research continued with much-reduced funding. By the mid-1970s, some successes could be shown: in 1970 the US Air Force began to use the Systran system for Russian-English translation, in 1976 the Canadians began public use of weather reports translated by the Meteo sublanguage machine translation system, and the Commission of the European Communities adopted the English-French version of Systran to help with its heavy translation burden, which was soon followed by the development of systems for other European languages. In the 1980s, machine translation recovered from its post-ALPAC doldrums: activity began again all over the world, most notably in Japan, with new ideas for research (particularly on knowledge-based and interlingua-based systems), new sources of financial support (the European Union, computer companies), and in particular the appearance of the first commercial machine translation systems on the market.
Initially, however, attention in this renewed activity was still focused almost exclusively on automatic translation with human assistance, whether before (pre-editing), during (interactive resolution of problems) or after (post-editing) the translation process itself. The development of computer-based aids or tools for use by human translators remained relatively neglected, despite translators' explicit requests for them.
Nearly all research activity in the 1980s was devoted to exploring methods of linguistic analysis and generation for programs based on traditional rule-based transfer and interlingua designs (AI-type knowledge-based systems representing the more innovative tendency). The needs of translators were left to commercial interests: software for terminology management became available, and ALPNET produced a series of translator tools during the 1980s, among them, it may be noted, an early version of a "translation memory" program (a bilingual database).
Machine Translation in the 1990s
The real emergence of translator aids came in the early 1990s with the "translator workstation": programs such as the Trados Translator Workbench, IBM Translation Manager 2, STAR Transit and Eurolang Optimizer, which combined sophisticated text processing and publishing software, terminology management and translation memories.
In the early 1990s, research on machine translation was reinvigorated by the arrival of corpus-based methods, especially by the introduction of statistical methods (IBM's "Candide" project) and of example-based translation. Statistical (stochastic) techniques brought release from the increasingly evident limitations and inadequacies of the previous, exclusively rule-based (often syntax-oriented) approaches. Problems of disambiguation, avoidance of repetition and more idiomatic generation have become more tractable with corpus-based techniques. On their own, statistical methods are no more the complete answer than rule-based methods were, but there are now prospects of improved output quality that did not seem attainable fifteen years ago. As many observers have noted, the most promising approaches will probably integrate rule-based and corpus-based methods. Even outside research environments such integration is already evident: many commercial machine translation systems now incorporate translation memories, and many translation memory systems are being enriched with machine translation methods.
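To make the notion of a translation memory concrete: at its simplest it is, as noted above, a bilingual database of previously translated segments, queried by similarity rather than exact match. The following minimal sketch is purely illustrative and describes no particular commercial product; the class and method names, the similarity measure (plain edit-distance ratio) and the example sentences are all invented for this sketch, and real systems use far more sophisticated segment matching.

```python
# A loose, illustrative sketch (not from the article): a toy "translation
# memory" modelled as a bilingual database of previously translated
# segments, queried by fuzzy string similarity. All names here
# (TranslationMemory, lookup, the example sentences) are hypothetical.
from difflib import SequenceMatcher

class TranslationMemory:
    def __init__(self):
        self.segments = []  # stored (source, target) sentence pairs

    def add(self, source: str, target: str) -> None:
        self.segments.append((source, target))

    def lookup(self, sentence: str, threshold: float = 0.7):
        """Return the closest stored (source, target, score) triple whose
        source segment matches `sentence` at or above `threshold`, else None."""
        best = None
        best_score = threshold
        for src, tgt in self.segments:
            score = SequenceMatcher(None, sentence, src).ratio()
            if score >= best_score:
                best, best_score = (src, tgt, score), score
        return best

tm = TranslationMemory()
tm.add("The report is ready.", "Le rapport est prêt.")
print(tm.lookup("The report is ready"))  # fuzzy match despite the missing period
```

The design point this illustrates is why translation memories and machine translation combine so naturally: when the fuzzy lookup fails (returns None), a system can fall back on machine translation, and accepted translations can be added back to the memory.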
The main feature of the 1990s has been the rapid increase in the use of machine translation and translation tools. The globalization of commerce and information is placing ever greater demands on the provision of translations. This means continuing (perhaps even accelerating) growth in the use by multinational companies and translation services of systems to assist in the production of good-quality documentation in many languages, whether through machine translation and translation memory systems, through multilingual document authoring systems, or through combinations of both. Until recently, the production of translations has been seen as an essentially self-contained activity. For large us...