Showing items 1-4 of 4
Parsing as Pretraining
(2020)
[Abstract] Recent analyses suggest that encoders pretrained for language modeling capture certain morpho-syntactic structure. However, probing frameworks for word vectors still do not report results on standard setups ...
Cognitive Constraints Built into Formal Grammars: Implications for Language Evolution
(Ravignani, A., Barbieri, C., Martins, M., Flaherty, M., Jadoul, Y., Lattenkamp, E., Little, H., Mudd, K., Verhoef, T., 2020-04-17)
[Abstract] We study the validity of the cognitive independence assumption using an ensemble of artificial syntactic structures from various classes of dependency grammars. Our findings show that memory limitations have ...
Towards Robust Word Embeddings for Noisy Texts
(MDPI, 2020)
[Abstract] Research on word embeddings has mainly focused on improving their performance on standard corpora, disregarding the difficulties posed by noisy texts in the form of tweets and other types of non-standard writing ...
Transition-based Semantic Dependency Parsing with Pointer Networks
(Association for Computational Linguistics (ACL), 2020-07)
[Abstract] Transition-based parsers implemented with Pointer Networks have become the new state of the art in dependency parsing, excelling in producing labelled syntactic trees and outperforming graph-based models in ...