Use this link to cite:
http://hdl.handle.net/2183/42130 Efficient Single-Step Framework for Incremental Class Learning in Neural Networks
Authors
Dopico Castro, Alejandro
Other responsibilities
Universidade da Coruña. Facultade de Informática
Abstract
[Abstract]: Incremental learning remains a significant challenge in deep learning, particularly in environments where resources are limited. While existing methods achieve high accuracy, they often require substantial computational resources and storage capacity. This work proposes CIFNet, an efficient approach to class incremental learning that achieves accuracy comparable to state-of-the-art methods while significantly reducing training time and energy consumption through a single-step optimisation process. CIFNet incorporates a novel compressed buffer mechanism that stores condensed representations of previous data samples instead of full raw data, substantially reducing memory requirements. In contrast to conventional approaches that require many iterations of weight optimisation, our method reaches its final weights in a single training step, greatly decreasing computational overhead. Experiments on standard benchmark datasets show that our approach inherently mitigates catastrophic forgetting without the need for complex regularisation schemes, while matching the accuracy of current state-of-the-art approaches at a fraction of the training time and energy. This work represents a step forward in making class incremental learning more accessible for resource-constrained environments while maintaining robust performance.
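The abstract's two key ideas, a compressed buffer of condensed statistics in place of raw exemplars, and a single closed-form training step instead of iterative optimisation, can be illustrated with a hedged sketch. This is not CIFNet itself (the thesis text is not reproduced here); it is a generic single-step class-incremental classifier built on the same principles, where the "compressed buffer" is assumed to be sufficient statistics (a Gram matrix and per-class feature sums) and the single step is a closed-form ridge-regression solve. All class and function names are illustrative.

```python
import numpy as np

class SingleStepIncrementalClassifier:
    """Illustrative class-incremental linear classifier fitted in one step.

    Instead of storing raw exemplars of past classes, it keeps compact
    sufficient statistics (an accumulated Gram matrix and per-class
    feature sums), so memory does not grow with the number of samples.
    """

    def __init__(self, feat_dim, reg=1e-3):
        self.reg = reg
        self.G = np.zeros((feat_dim, feat_dim))   # accumulated F^T F
        self.class_sums = {}                      # per-class sum of features
        self.classes = None
        self.weights = None

    def fit_task(self, feats, labels):
        # Accumulate statistics from the new task's data only.
        self.G += feats.T @ feats
        for c in np.unique(labels):
            s = feats[labels == c].sum(axis=0)
            self.class_sums[c] = self.class_sums.get(c, 0.0) + s
        # Single closed-form ridge solve over all classes seen so far:
        #   W = (F^T F + reg*I)^{-1} F^T Y, with one-hot targets Y,
        # where column c of F^T Y is exactly the class-c feature sum.
        self.classes = np.array(sorted(self.class_sums))
        Y = np.stack([self.class_sums[c] for c in self.classes], axis=1)
        A = self.G + self.reg * np.eye(self.G.shape[0])
        self.weights = np.linalg.solve(A, Y)

    def predict(self, feats):
        return self.classes[np.argmax(feats @ self.weights, axis=1)]


# Tiny demo: two sequential tasks of two classes each, drawn from
# well-separated Gaussian clusters (a toy stand-in for backbone features).
rng = np.random.default_rng(0)

def make_task(class_ids, n=50, dim=8):
    X, y = [], []
    for c in class_ids:
        mean = np.zeros(dim)
        mean[c] = 5.0
        X.append(mean + 0.3 * rng.standard_normal((n, dim)))
        y.append(np.full(n, c))
    return np.vstack(X), np.concatenate(y)

clf = SingleStepIncrementalClassifier(feat_dim=8)
X1, y1 = make_task([0, 1])
clf.fit_task(X1, y1)            # task 1: classes 0 and 1
X2, y2 = make_task([2, 3])
clf.fit_task(X2, y2)            # task 2: classes 2 and 3, no raw replay

Xt, yt = make_task([0, 1, 2, 3], n=25)
accuracy = (clf.predict(Xt) == yt).mean()
```

Because past tasks survive only as accumulated statistics, refitting after each task costs one linear solve rather than repeated gradient epochs, which is the spirit of the single-step, low-energy training the abstract describes.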
Rights
Attribution-ShareAlike 3.0 Spain (CC BY-SA 3.0 ES)