When dealing with distributed applications in Edge or Fog computing environments, the service latency that the user experiences at a given node can be considered an indicator of how loaded the node is with respect to the others. Indeed, considering only the average CPU time or the RAM utilisation, for example, does not give a clear depiction of the load situation, because these parameters are application- and hardware-agnostic. They do not give any information about how the application is performing from the user's perspective, and they cannot be used for QoS-oriented load balancing. In this paper, we propose a load balancing algorithm that focuses on the service latency, with the objective of levelling it across all the nodes in a fully decentralised manner. In this way, no user experiences a worse QoS than the others. By providing a differential model of the system and an adaptive heuristic to find the solution to the problem in real settings, we show both in simulation and in a real-world deployment, based on a cluster of Raspberry Pi boards, that our approach is able to level the service latency among a set of heterogeneous nodes organised in different topologies.
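To illustrate the idea of decentralised latency levelling, the following is a minimal sketch of a diffusion-style heuristic in which each node compares its observed service latency with that of its neighbours and shifts load (and hence latency) towards the pairwise mean. The update rule, step size `alpha`, and the example ring topology are all assumptions for illustration; the paper's actual differential model and adaptive heuristic may differ.

```python
# Hypothetical sketch: decentralised latency levelling by diffusion.
# Each node repeatedly exchanges a fraction of its load with its
# neighbours, proportional to the latency difference. This is an
# illustrative heuristic, not the paper's algorithm.

def level_latency(latency, neighbours, alpha=0.25, rounds=200):
    """latency: per-node latency values; neighbours: adjacency list."""
    latency = list(latency)
    for _ in range(rounds):
        new = list(latency)
        for i, nbrs in enumerate(neighbours):
            for j in nbrs:
                # shift latency from the more loaded node to the less loaded one
                delta = alpha * (latency[i] - latency[j]) / (len(nbrs) + 1)
                new[i] -= delta
                new[j] += delta
        latency = new  # synchronous update: all nodes act on the same snapshot
    return latency

# Ring of four heterogeneous nodes with uneven initial latencies.
nbrs = [[1, 3], [0, 2], [1, 3], [0, 2]]
start = [10.0, 2.0, 6.0, 4.0]
end = level_latency(start, nbrs)
spread = max(end) - min(end)  # shrinks towards 0 as latencies equalise
print(round(spread, 6))
```

Because each exchange is symmetric (what one node sheds, the neighbour absorbs), the total load is conserved while the per-node latencies converge to a common value, which is the levelling behaviour the abstract describes.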


Proietti Mattia, G., Pietrabissa, A., & Beraldi, R. (2023). A Load Balancing Algorithm for Equalising Latency across Fog or Edge Computing Nodes. IEEE Transactions on Services Computing, 1–12. https://doi.org/10.1109/TSC.2023.3265883

@article{ProiettiMattia2023LoadBalancing,
  title   = {A Load Balancing Algorithm for Equalising Latency across Fog or Edge Computing Nodes},
  author  = {Proietti Mattia, Gabriele and Pietrabissa, Antonio and Beraldi, Roberto},
  year    = {2023},
  journal = {IEEE Transactions on Services Computing},
  pages   = {1--12},
  doi     = {10.1109/TSC.2023.3265883},
  issn    = {1939-1374}
}