Performance and sustainability of BERT derivatives in dyadic data

Use this link to cite
http://hdl.handle.net/2183/40124
Except where otherwise noted, this item's license is described as Attribution 4.0 International (CC BY)
Collections
- Investigación (FIC) [1615]
Metadata
Title
Performance and sustainability of BERT derivatives in dyadic data
Author(s)
M. Escarda, C. Eiras-Franco, B. Cancela, B. Guijarro-Berdiñas, A. Alonso-Betanzos
Date
2025-03
Bibliographic citation
M. Escarda, C. Eiras-Franco, B. Cancela, B. Guijarro-Berdiñas, and A. Alonso-Betanzos, "Performance and sustainability of BERT derivatives in dyadic data", Expert Systems with Applications, Vol. 262, article number 125647, March 2025, https://doi.org/10.1016/j.eswa.2024.125647
Abstract
In recent years, the Natural Language Processing (NLP) field has experienced a revolution in which numerous models based on the Transformer architecture have emerged to process the ever-growing volume of online text-generated data. This architecture has been the basis for the rise of Large Language Models (LLMs), enabling their application to many diverse tasks in which they excel after just a fine-tuning process that follows a vast pre-training phase. However, their sustainability is often overlooked, especially regarding computational and environmental costs. Our research compares various BERT derivatives in the context of a dyadic data task while also drawing attention to the growing need for sustainable AI solutions. To this end, we utilize a selection of transformer models in an explainable recommendation setting, modeled as a multi-label classification task originating from a social network context in which users, restaurants, and reviews interact.
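The abstract frames the task as fine-tuning BERT derivatives for multi-label classification over review text. As a rough illustration of that kind of setup, the sketch below fine-tunes a Hugging Face checkpoint with a multi-label head; the checkpoint name (distilbert-base-uncased), the label count, and the example review and labels are assumptions for illustration, not the paper's actual configuration or data.

# Minimal sketch: fine-tuning a BERT derivative for multi-label
# classification with Hugging Face Transformers. Model name, label
# count, and example data are illustrative assumptions only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NUM_LABELS = 5  # hypothetical number of explanation labels

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",  # selects BCE-with-logits loss
)

# A hypothetical restaurant review and its multi-hot label vector
# (floats, as required by the BCE-with-logits loss).
text = "Great food and friendly staff, but the wait was long."
labels = torch.tensor([[1.0, 1.0, 0.0, 1.0, 0.0]])

inputs = tokenizer(text, truncation=True, return_tensors="pt")
outputs = model(**inputs, labels=labels)  # loss computed internally
outputs.loss.backward()  # gradient for one fine-tuning step

# At inference, independent sigmoid scores give one decision per label.
probs = torch.sigmoid(outputs.logits)
predicted = (probs > 0.5).int()

Setting problem_type="multi_label_classification" makes the library apply a binary cross-entropy loss with logits, so each label receives an independent sigmoid decision rather than a single softmax over mutually exclusive classes, which matches the multi-label framing described in the abstract.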
Keywords
BERT
Dyadic data
Sustainable AI
Transformer
Publisher's version
https://doi.org/10.1016/j.eswa.2024.125647
Rights
Attribution 4.0 International (CC BY)