Performance and sustainability of BERT derivatives in dyadic data

Use this link to cite
http://hdl.handle.net/2183/40124
Collections
- Investigación (FIC) [1615]
Metadata
Title
Performance and sustainability of BERT derivatives in dyadic data
Author(s)
M. Escarda; C. Eiras-Franco; B. Cancela; B. Guijarro-Berdiñas; A. Alonso-Betanzos
Date
2025-03
Citation
M. Escarda, C. Eiras-Franco, B. Cancela, B. Guijarro-Berdiñas, and A. Alonso-Betanzos, "Performance and sustainability of BERT derivatives in dyadic data", Expert Systems with Applications, Vol. 262, article number 125647, March 2025, https://doi.org/10.1016/j.eswa.2024.125647
Abstract
In recent years, the Natural Language Processing (NLP) field has experienced a revolution, in which numerous models based on the Transformer architecture have emerged to process the ever-growing volume of online text-generated data. This architecture has been the basis for the rise of Large Language Models (LLMs), enabling their application to many diverse tasks in which they excel after just a fine-tuning process that follows a vast pre-training phase. However, their sustainability is often overlooked, especially regarding computational and environmental costs. Our research compares various BERT derivatives on a dyadic data task while also drawing attention to the growing need for sustainable AI solutions. To this end, we use a selection of transformer models in an explainable recommendation setting, modeled as a multi-label classification task originating from a social network context in which users, restaurants, and reviews interact.
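For illustration only (this is not the authors' code): a minimal sketch of the kind of setup the abstract describes, fine-tuning a BERT derivative for multi-label classification with the Hugging Face transformers library. The model name and the review-aspect labels are placeholder assumptions, not details taken from the paper.

```python
# Minimal sketch: fine-tuning a BERT derivative for multi-label classification.
# Model choice and aspect labels below are hypothetical placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "distilbert-base-uncased"  # any BERT derivative could be swapped in
LABELS = ["food", "service", "ambience", "price"]  # hypothetical review aspects

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME,
    num_labels=len(LABELS),
    problem_type="multi_label_classification",  # uses BCEWithLogitsLoss internally
)

# One toy (review, multi-hot label) pair standing in for the social-network data.
text = "Great tapas and friendly staff, but the room was noisy."
labels = torch.tensor([[1.0, 1.0, 1.0, 0.0]])  # multi-hot: food, service, ambience

inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)
outputs = model(**inputs, labels=labels)  # loss is computed internally
outputs.loss.backward()                   # an optimizer step would follow here

# At inference, independent sigmoids give one probability per label.
# (With an untrained classification head, predictions are arbitrary.)
probs = torch.sigmoid(outputs.logits)
predicted = [lab for lab, p in zip(LABELS, probs[0].tolist()) if p > 0.5]
print(outputs.loss.item(), predicted)
```

Because each label gets its own sigmoid rather than a shared softmax, several aspects can be predicted for the same review, which is what makes the task multi-label rather than multi-class.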
Keywords
BERT
Dyadic data
Sustainable AI
Transformer
Rights
Attribution 4.0 International (CC BY)