Take one for the team: on the time efficiency of application-level buffer-aided relaying in edge cloud communication

Use this link to cite
http://hdl.handle.net/2183/27800
Collections
- Investigación (FIC) [1634]
Metadata
Title
Take one for the team: on the time efficiency of application-level buffer-aided relaying in edge cloud communication
Date
2021-03-12
Bibliographic citation
Li, Z., Millar-Bilbao, F., Rojas-Durán, G. et al. Take one for the team: on the time efficiency of application-level buffer-aided relaying in edge cloud communication. J Cloud Comp 10, 24 (2021). https://doi.org/10.1186/s13677-021-00241-x
Abstract
Background
Adding buffers to networks has been one of the fundamental advances in data communication. Since edge cloud computing is based on a heterogeneous collaboration network model in a federated environment, it is natural to consider buffer-aided data communication for edge cloud applications. However, existing studies generally pursue the beneficial features of buffering at the cost of time, and many investigations focus on lower-layer data packets rather than application-level communication transactions.
Aims
Driven by our argument against the claim that buffers “can introduce additional delay to the communication between the source and destination”, this research aims to investigate whether (and, if so, to what extent) an application-level buffering mechanism can improve time efficiency in edge cloud data transmissions.
Method
To collect empirical evidence for the theoretical discussion, we built a testbed that simulates a remote health monitoring system, and conducted both experimental and modeling investigations into first-in-first-served (FIFS) and buffer-aided data transmissions at a relay node in the system.
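As a rough sketch of the two relaying strategies being compared (not the authors' testbed code), the Python fragment below contrasts a FIFS relay loop, which forwards every incoming data entity in its own transaction, with a buffer-aided loop that aggregates entities into one transaction. All names (the incoming queue, forward(), BUFFER_SIZE) and the timing constants are illustrative assumptions.

import queue
import time

# Illustrative constants, not measured values from the paper's testbed.
BUFFER_SIZE = 50                 # data entities aggregated into one transaction
PER_TRANSACTION_OVERHEAD = 0.05  # fixed cost per transaction, in seconds
PER_ENTITY_COST = 0.01           # cost to carry one entity's payload, in seconds

def forward(entities):
    # Simulate one relay-to-cloud transaction carrying one or more entities.
    time.sleep(PER_TRANSACTION_OVERHEAD + PER_ENTITY_COST * len(entities))

def fifs_relay(incoming):
    # First-in-first-served: each entity triggers its own transaction.
    while True:
        entity = incoming.get()
        forward([entity])

def buffered_relay(incoming):
    # Buffer-aided: accumulate entities and send them as a single transaction.
    buffer = []
    while True:
        buffer.append(incoming.get())
        if len(buffer) >= BUFFER_SIZE:
            forward(buffer)
            buffer.clear()

# Usage sketch (hypothetical): feed monitoring readings into the queue and run
# one of the relay loops in a worker thread.
incoming_entities = queue.Queue()

With a fixed per-transaction overhead, the buffered loop amortizes that overhead over up to BUFFER_SIZE entities, which is the effect the experiments aim to quantify.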
Results
An empirical inequality system is established to characterize the time efficiency of buffer-aided edge cloud communication. For example, taking the FIFS transmission of the 11th data entity as the reference, the inequality system suggests buffering up to 50 data entities into one transmission transaction on our testbed.
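To make the shape of this result concrete (using notation of our own, not the paper's inequality system): if T_FIFS(k) denotes the time at which the k-th data entity has been delivered under FIFS, and T_buf(n) the time at which a single buffered transaction of n entities completes, the quoted example corresponds roughly to

\[ T_{\mathrm{buf}}(n) \;\le\; T_{\mathrm{FIFS}}(11) \quad \text{for } n \le 50 \]

on the testbed described above; the exact thresholds are empirical and specific to that setup.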
Conclusions
Beyond the trade-off benefits (e.g., energy efficiency and fault tolerance) of buffering data, our investigation argues that the buffering mechanism can also speed up data transmission under certain circumstances; it would therefore be worth taking data buffering into account when designing and developing edge cloud applications, even in time-critical contexts.
Keywords
Atomic data entity
Buffering
Communication performance
Edge cloud computing
Time efficiency
Rights
Atribución 3.0 España
© The Author(s) 2021. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
ISSN
2192-113X