
      UdS Submission for the WMT 19 Automatic Post-Editing Task

      Preprint

          Abstract

          In this paper, we describe our submission to the English-German APE shared task at WMT 2019. We utilize and adapt an NMT architecture originally developed for exploiting context information to APE, implement this in our own transformer model and explore joint training of the APE task with a de-noising encoder.
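The abstract only sketches the approach. Below is a minimal, illustrative PyTorch sketch of one way such a setup could look: a transformer with two encoders (one for the source sentence, one for the raw MT output) whose decoder attends to both, trained jointly with a de-noising objective on the post-edit. This is not the authors' implementation; all module names, hyper-parameters, the masking scheme, the loss weighting, and the assumed `batch` keys (`src`, `mt`, `pe_in`, `pe_out`) are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class DualEncoderAPE(nn.Module):
    """Sketch of an APE model: two encoders (source sentence and raw MT
    output); the decoder attends to both and predicts the post-edit."""

    def __init__(self, vocab_size, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.src_encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.mt_encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, mt_ids, pe_ids):
        # Encode source and MT output separately, then concatenate their
        # representations so the decoder can attend to both contexts.
        src_mem = self.src_encoder(self.embed(src_ids))
        mt_mem = self.mt_encoder(self.embed(mt_ids))
        memory = torch.cat([src_mem, mt_mem], dim=1)
        seq_len = pe_ids.size(1)
        causal = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=pe_ids.device),
            diagonal=1,
        )
        hidden = self.decoder(self.embed(pe_ids), memory, tgt_mask=causal)
        return self.out(hidden)


def joint_step(model, batch, alpha=0.5, mask_prob=0.15, mask_id=4):
    """Assumed joint objective: APE cross-entropy plus a de-noising term
    that reconstructs the post-edit from a randomly masked copy of it."""
    ce = nn.CrossEntropyLoss(ignore_index=0)

    # Standard APE loss: predict the post-edit given (src, mt).
    logits = model(batch["src"], batch["mt"], batch["pe_in"])
    ape_loss = ce(logits.reshape(-1, logits.size(-1)),
                  batch["pe_out"].reshape(-1))

    # De-noising loss: corrupt the post-edit and ask the model to restore it.
    noisy_pe = batch["pe_in"].clone()
    noise_mask = torch.rand_like(noisy_pe, dtype=torch.float) < mask_prob
    noisy_pe[noise_mask] = mask_id
    logits_dn = model(batch["src"], noisy_pe, batch["pe_in"])
    dn_loss = ce(logits_dn.reshape(-1, logits_dn.size(-1)),
                 batch["pe_out"].reshape(-1))

    return ape_loss + alpha * dn_loss
```

The shared encoder/decoder parameters between the APE and de-noising passes are what makes this a joint training sketch; how the paper actually combines the two objectives is not specified in the abstract.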

Most cited references (15)

• Neural Machine Translation of Rare Words with Subword Units
• The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation
• Improving the Transformer Translation Model with Document-Level Context

                Author and article information

Journal
Date: 09 August 2019
Article
arXiv ID: 1908.03402
78e2b8ac-2ea5-47fc-bb7f-bb7c8e497d97

License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

History
Custom metadata
WMT 2019 Automatic Post-Editing Shared Task Paper
cs.CL

Theoretical computer science
