Abstract—The cloud paradigm is one of the most trending areas in today's era due to its rich profusion of services. However, it fails to serve latency-sensitive Industrial Internet of Things (IIoT) applications associated with automotives, robotics, oil and gas, smart communications, Industry 5.0, etc. Hence, to strengthen the capabilities of IIoT, fog computing has emerged as a promising solution for latency-aware IIoT tasks. However, the resource-constrained nature of fog nodes raises another substantial issue in resource management: the offloading decision. Therefore, we propose an Artificial Intelligence (AI)-enabled, intelligent, and sustainable framework for an optimized multi-layered integrated cloud-fog environment in which real-time offloading decisions are made according to the demands of IIoT applications and analyzed by a fuzzy-based offloading controller. Moreover, an AI-based Whale Optimization Algorithm (WOA) has been incorporated into the framework to search for the best possible resources and make accurate decisions that improve various Quality-of-Service (QoS) parameters. The experimental results show improvements of up to 37.17% in makespan time, up to 27.32% in energy consumption, and up to 13.36% in execution cost in comparison to benchmark offloading and allocation schemes.

Index Terms—Task offloading, Internet of Things (IoT), fog computing, resource allocation, artificial intelligence (AI).

I. INTRODUCTION

With the digital infrastructure revolutionizing the world at an expeditious rate, the Industrial Internet of Things (IIoT) is emerging rapidly, embracing applications such as automotives, smart cities, healthcare, waste and water management, robotics, smart communication, smart power grids, etc. These Industry 5.0-based use cases thrive on resources to process, analyze, and store the colossal amount of data generated by IIoT applications. In contrast to IoT applications such as handheld devices, which are consumer-centric in nature, IIoT applications place greater emphasis on achieving sustainability, ensuring health and safety, and improving overall system efficiency.

However, unprecedented advancements in high-speed networking capabilities such as 5G and beyond, pervasive computing devices, mobile applications, and IoT sensors are generating tremendous amounts of data that require real-time analytics. For instance, consider the Industry 5.0 scenario, which is emerging with the concept of Collaborative Robots (COBOTS) that learn to work with humans in a collaborative manner. In contrast to industrial robots, which are designated to perform specific tasks, COBOTS are equipped with the intelligence to perform a diverse range of tasks alongside humans in a cooperative manner, ensuring safety and improving the productivity of enterprises. Despite these eminent capabilities, any robotic malfunction due to a delayed response could make the situation worse. Hence, sending the data perceived by IIoT sensors to centralized cloud datacenters is not a practical solution.

Henceforth, a novel distributed paradigm known as Fog Computing (FC) has emerged, which leverages cloud characteristics with new features such as location awareness and edge datacenter deployment. It refers to computing at the edge of the network, enabling distributed computing solutions that maximize scalability, elasticity, and resiliency while minimizing computational costs and supporting efficient information sharing, among other benefits [1]. However, its complicated decentralized architecture imposes various challenges in effectively utilizing the underlying heterogeneous resources.

The resource management issues in the fog landscape comprise resource provisioning, task offloading, task mapping, and service placement. In recent works, some researchers have proposed solutions for optimal resource surveillance and management mechanisms. However, ensuring the optimality of task offloading decisions is quite crucial in a collaborative Fog-Cloud environment for Industry 5.0 use cases. The complexity of this decision involves many factors, such as which task to offload, the offloading destination (Fog Nodes or a cloud datacenter instance), and when to offload the task [2]. Delay-intensive tasks are offloaded to Fog Nodes (FNs), consequently reducing processing and transmission delays to meet the Quality-of-Service (QoS) constraints. To serve these capabilities, our work aims to propose an effective task offloading strategy that satisfies QoS parameters such as makespan, cost, and task rejection ratio.
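To make the offloading decision concrete, the sketch below models the three questions raised above (which task to offload, where to offload it, and when to fall back to the cloud) as a simple latency-threshold rule over candidate fog nodes. This is only an illustrative sketch, not the framework proposed in this paper: the class names, the deadline threshold, and the cost figures are our own assumptions, and the static rule shown here is what the fuzzy-based offloading controller and WOA-driven resource selection of later sections are meant to replace.

from dataclasses import dataclass
from typing import List

# Illustrative only: these classes and numbers are assumptions, not the
# paper's model. They show how an offloading decision could weigh latency
# (makespan) against execution cost for a single delay-sensitive task.

@dataclass
class Task:
    task_id: int
    size_mi: float          # workload in million instructions
    data_in_mb: float       # input data to transfer
    deadline_ms: float      # latency bound for the IIoT task

@dataclass
class Node:
    name: str
    mips: float             # processing capacity
    bandwidth_mbps: float   # uplink bandwidth from the IIoT device
    cost_per_mi: float      # monetary cost per million instructions

def estimated_latency_ms(task: Task, node: Node) -> float:
    """Transmission delay plus processing delay, in milliseconds."""
    tx = (task.data_in_mb * 8.0 / node.bandwidth_mbps) * 1000.0
    proc = (task.size_mi / node.mips) * 1000.0
    return tx + proc

def offload_decision(task: Task, fog_nodes: List[Node], cloud: Node) -> Node:
    """Pick the cheapest fog node that meets the deadline; otherwise use the cloud."""
    feasible = [n for n in fog_nodes
                if estimated_latency_ms(task, n) <= task.deadline_ms]
    if feasible:
        return min(feasible, key=lambda n: task.size_mi * n.cost_per_mi)
    return cloud  # deadline cannot be met at the edge; fall back to the datacenter

# Example usage with made-up numbers: fog-2 meets the deadline at lower cost.
task = Task(task_id=1, size_mi=500, data_in_mb=1.0, deadline_ms=200)
fogs = [Node("fog-1", mips=8000, bandwidth_mbps=100, cost_per_mi=0.002),
        Node("fog-2", mips=4000, bandwidth_mbps=200, cost_per_mi=0.001)]
cloud = Node("cloud-dc", mips=40000, bandwidth_mbps=20, cost_per_mi=0.0005)
print(offload_decision(task, fogs, cloud).name)  # prints "fog-2"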
Manuscript received 8 July 2023; revised 21 August 2023; accepted 22 September 2023. Date of publication 29 September 2023; date of current version 26 April 2024. (Corresponding author: Mohit Kumar.)
Mohit Kumar and Guneet Kaur Walia are with the Department of Information Technology, Dr. B. R. Ambedkar National Institute of Technology Jalandhar, Jalandhar 144027, India (e-mail: [email protected]; [email protected]).
Haresh Shingare and Samayveer Singh are with the Department of Computer Science and Engineering, Dr. B. R. Ambedkar National Institute of Technology Jalandhar, Jalandhar 144027, India (e-mail: [email protected]; [email protected]).
Sukhpal Singh Gill is with the School of Electronic Engineering and Computer Science, Queen Mary University of London, E1 4NS London, U.K. (e-mail: [email protected]).
Digital Object Identifier 10.1109/TCE.2023.3320673