Showing items 1-10 of 33
Increasing NLP Parsing Efficiency with Chunking
(MDPI AG, 2018-09-19)
[Abstract] We introduce a “Chunk-and-Pass” parsing technique influenced by a psycholinguistic model, where linguistic information is processed not word-by-word but rather in larger chunks of words. We present preliminary ...
Tratamiento sintáctico de la negación en análisis del sentimiento monolingüe y multilingüe
(2017-09-19)
[Abstract] Dealing with negation in a proper way is a relevant factor in order to obtain high performance sentiment analysis systems. In this framework, we present a method for the treatment of negation in Spanish that ...
Sequence Tagging for Fast Dependency Parsing
(2019)
[Abstract] Dependency parsing has been built upon the idea of using parsing methods based on shift-reduce or graph-based algorithms in order to identify binary dependency relations between the words in a sentence. In this ...
Parsing as Pretraining
(2020)
[Abstract] Recent analyses suggest that encoders pretrained for language modeling capture certain morpho-syntactic structure. However, probing frameworks for word vectors still do not report results on standard setups ...
On the Challenges of Fully Incremental Neural Dependency Parsing
(Association for Computational Linguistics, 2023-11)
[Abstract] Since the popularization of BiLSTMs and Transformer-based bidirectional encoders, state-of-the-art syntactic parsers have lacked incrementality, requiring access to the whole sentence and deviating from ...
A Unifying Theory of Transition-based and Sequence Labeling Parsing
(International Committee on Computational Linguistics, 2020-12)
[Abstract] We define a mapping from transition-based parsing algorithms that read sentences from left to right to sequence labeling encodings of syntactic trees. This not only establishes a theoretical relation between ...
Better, Faster, Stronger Sequence Tagging Constituent Parsers
(Association for Computational Linguistics, 2019-06)
[Abstract] Sequence tagging models for constituent parsing are faster, but less accurate than other types of parsers. In this work, we address the following weaknesses of such constituent parsers: (a) high error rates ...
Sequence Labeling Parsing by Learning across Representations
(Association for Computational Linguistics, 2019-07)
[Abstract] We use parsing as sequence labeling as a common framework to learn across constituency and dependency syntactic abstractions. To do so, we cast the problem as multitask learning (MTL). First, we show that ...
Harry Potter and the Action Prediction Challenge from Natural Language
(Association for Computational Linguistics, 2019-06)
[Abstract] We explore the challenge of action prediction from textual descriptions of scenes, a testbed to approximate whether text inference can be used to predict upcoming actions. As a case study, we consider the ...
HEAD-QA: A Healthcare Dataset for Complex Reasoning
(Association for Computational Linguistics, 2019-07)
[Abstract] We present HEAD-QA, a multi-choice question answering testbed to encourage research on complex reasoning. The questions come from exams for access to specialized positions in the Spanish healthcare system, and ...