Authors: Fernández-González, Daniel; Gómez-Rodríguez, Carlos
Date: 2023-03-22 (issued 2023-03)
Citation: Fernández-González, Daniel; Gómez-Rodríguez, Carlos (2023): Discontinuous grammar as a foreign language. Neurocomputing 524: 43–58
ISSN: 0925-2312
URI: http://hdl.handle.net/2183/32741

Abstract: In order to achieve deep natural language understanding, syntactic constituent parsing is a vital step, highly demanded by many artificial intelligence systems to process both text and speech. One of the most recent proposals is the use of standard sequence-to-sequence models to perform constituent parsing as a machine translation task, instead of applying task-specific parsers. While they show a competitive performance, these text-to-parse transducers are still lagging behind classic techniques in terms of accuracy, coverage and speed. To close the gap, we here extend the framework of sequence-to-sequence models for constituent parsing, not only by providing a more powerful neural architecture for improving their performance, but also by enlarging their coverage to handle the most complex syntactic phenomena: discontinuous structures. To that end, we design several novel linearizations that can fully produce discontinuities and, for the first time, we test a sequence-to-sequence model on the main discontinuous benchmarks, obtaining competitive results on par with task-specific discontinuous constituent parsers and achieving state-of-the-art scores on the (discontinuous) English Penn Treebank.

Language: English
License: Attribution-NonCommercial-NoDerivatives 4.0 International — http://creativecommons.org/licenses/by-nc-nd/3.0/es/
Keywords: Natural language processing; Computational linguistics; Parsing; Discontinuous constituent parsing; Neural networks; Deep learning; Sequence-to-sequence model
Title: Discontinuous grammar as a foreign language
Type: journal article
Access: open access
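To make the "parsing as translation" idea in the abstract concrete, the sketch below shows the classic bracket-sequence linearization of a constituent tree in the style of "Grammar as a Foreign Language" (Vinyals et al.). This is an illustrative assumption, not the paper's own novel discontinuous linearizations; the tree representation, the `linearize` function, and the `XX` leaf placeholder are hypothetical choices for the example.

```python
def linearize(tree):
    """Flatten a nested (label, children) tree into a token sequence that a
    seq2seq decoder can emit as its target 'sentence'.

    Leaves are plain word strings; they are replaced by the placeholder 'XX'
    so the model predicts structure only, with words filled back in later.
    Closing brackets carry their label (e.g. ')NP') to ease decoding."""
    if isinstance(tree, str):          # a leaf word
        return ["XX"]
    label, children = tree
    out = ["(" + label]                # open bracket with nonterminal label
    for child in children:
        out.extend(linearize(child))   # recurse into subtrees
    out.append(")" + label)            # matching labeled close bracket
    return out

# Example: (S (NP She) (VP (V sings)))
tree = ("S", [("NP", ["She"]), ("VP", [("V", ["sings"])])])
print(" ".join(linearize(tree)))
# (S (NP XX )NP (VP (V XX )V )VP )S
```

A plain left-to-right bracketing like this can only encode continuous spans, which is why the paper must design richer linearizations before discontinuous structures become reachable for a sequence-to-sequence model.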