Fast and Frugal Transfer Learning via Precomputed Features and Adaptive Normalization

Bibliographic citation

Vila-Cruz, D., Bolón-Canedo, V., Morán-Fernández, L. (2026). Fast and Frugal Transfer Learning via Precomputed Features and Adaptive Normalization. In: Martínez, L., et al. Intelligent Data Engineering and Automated Learning – IDEAL 2025. Lecture Notes in Computer Science, vol 16238. Springer, Cham. https://doi.org/10.1007/978-3-032-10486-1_14

Abstract

Deep learning models, particularly convolutional neural networks (CNNs), have achieved state-of-the-art performance in medical image classification. However, their deployment in real-world clinical environments is often constrained by hardware limitations, energy requirements, and the time-intensive nature of model fine-tuning. In this work, we propose a lightweight and energy-aware training strategy that decouples feature extraction from classifier optimization. By precomputing features and adapting batch normalization statistics with a sample-aware thresholding mechanism, we reduce computational overhead without sacrificing accuracy. A redesigned classifier head is trained using a margin-based weighted loss, which emphasizes ambiguous cases without requiring end-to-end backpropagation. Experimental results on two widely used medical imaging datasets, Brain Cancer MRI and BreakHis, demonstrate that our pipeline significantly reduces training time and CO₂ emissions while achieving competitive or superior accuracy compared to traditional fine-tuning approaches. This makes our method well-suited for resource-constrained settings or rapid prototyping environments.
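The core idea of the abstract (extract features once with a frozen backbone, then train only a small classifier head with a loss that up-weights ambiguous samples) can be sketched as follows. This is an illustrative toy, not the paper's implementation: a fixed random ReLU projection stands in for the pretrained CNN, the `exp(-|margin|)` weighting is an assumed instance of "margin-based weighting", and the batch-normalization adaptation step is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a frozen, pretrained backbone: a fixed random ReLU
# projection (the paper uses a real CNN; this is an assumption for brevity).
W_backbone = rng.normal(size=(64, 16))

def extract_features(x):
    # Features are precomputed ONCE; no gradients flow through this step.
    return np.maximum(x @ W_backbone, 0.0)

# Synthetic two-class inputs (64-dim vectors standing in for images).
X = rng.normal(size=(200, 64))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(float)

feats = extract_features(X)  # cached; reused unchanged at every epoch

# Lightweight classifier head trained with a margin-weighted logistic loss:
# samples near the decision boundary (small |logit|) receive larger weights,
# emphasizing ambiguous cases without end-to-end backpropagation.
w, b, lr = np.zeros(feats.shape[1]), 0.0, 0.1
for _ in range(300):
    logits = feats @ w + b
    p = 1.0 / (1.0 + np.exp(-logits))
    sample_w = np.exp(-np.abs(logits))   # assumed margin-based weighting
    grad = (p - y) * sample_w
    w -= lr * feats.T @ grad / len(y)
    b -= lr * grad.mean()

acc = ((feats @ w + b > 0) == (y == 1)).mean()
```

Because `feats` never changes, each epoch costs only a 16-dimensional linear update, which is where the training-time and energy savings described above come from.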

Description

Presented at: IDEAL 2025: Intelligent Data Engineering and Automated Learning, 26th International Conference, Jaén, Spain, November 13–15, 2025. This version of the article has been accepted for publication after peer review (when applicable) and is subject to Springer Nature's AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements or any corrections. The Version of Record is available online at: https://doi.org/10.1007/978-3-032-10486-1_14

Rights

Copyright © 2026, The Author(s), under exclusive license to Springer Nature Switzerland AG