Technologies and Infrastructure in Internet of Things (IoT)
Published in Nishu Gupta, Srinivas Kiran Gottapu, Rakesh Nayak, Anil Kumar Gupta, Mohammad Derawi, Jayden Khakurel, Human-Machine Interaction and IoT Applications for a Smarter World, 2023
Fog computing, also called edge computing, is intended for distributed computing where numerous devices connect to a cloud. The word “fog” is used to suggest a cloud's periphery or edge. The idea behind fog computing is to do as much processing as possible using computing units co-located with the data-generating devices, so that processed rather than raw data is forwarded, and bandwidth requirements are reduced. Also, the processed data is most likely needed by the same set of devices that generated them, therefore, by processing locally rather than remotely, the latency between the input and the response is minimized. This idea is similar to using special-purpose hardware/co-processors, such as digital signal processing units for performing fast Fourier transforms, which have long been used to reduce latency and burden on the central processing units (CPU) (Bonomi 2012).
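As a rough sketch of this idea (Python with NumPy assumed; the function, signal, and parameter names are illustrative rather than taken from the chapter), a fog node could compute the fast Fourier transform locally and forward only the dominant frequency peaks instead of the raw waveform:

```python
import numpy as np

def summarize_vibration(raw_samples, sample_rate_hz, top_k=3):
    """Run an FFT at the fog node and keep only the strongest frequency
    components, so the cloud receives a few numbers instead of the raw waveform."""
    spectrum = np.abs(np.fft.rfft(raw_samples))
    freqs = np.fft.rfftfreq(len(raw_samples), d=1.0 / sample_rate_hz)
    strongest = np.argsort(spectrum)[-top_k:][::-1]  # indices of the top_k peaks
    return [(float(freqs[i]), float(spectrum[i])) for i in strongest]

# Example: one second of a noisy 50 Hz vibration sampled at 1 kHz.
t = np.linspace(0, 1, 1000, endpoint=False)
raw = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(1000)

summary = summarize_vibration(raw, sample_rate_hz=1000)
print(f"forwarding {len(summary)} peaks instead of {len(raw)} raw samples:", summary)
```

The bandwidth saving comes from forwarding the short summary; the raw samples never leave the devices that produced them.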
Issues and Challenges Associated with Fog Computing in Healthcare 4.0
Published in Bhawana Rudra, Anshul Verma, Shekhar Verma, Bhanu Shrestha, Futuristic Research Trends and Applications of Internet of Things, 2022
Fog computing is the extension of the Internet of Things (IoT) and cloud computing in which computing facilities are extended toward the edge of the network. According to the U.S. National Institute of Standards and Technology, ‘Fog computing decentralizes management, data analytics, and applications within the network itself. This, in turn, minimizes the intervention time between requests and responses, providing local computing resource to devices and network connectivity to centralized services’ [24]. In recent years, fog computing has gained the attention of networking professionals because it decreases the load on the cloud. Placing a fog node between the sensor nodes and the cloud reduces the load on both the cloud and the network. Any device with computing facilities and Internet connectivity can work as a fog node. Cisco introduced the term ‘fog computing’ in 2012 to manage the high load on the cloud. The availability of small boards with computing capabilities, such as the Arduino, has increased the demand for fog computing. The fundamental purpose of fog computing is to reduce latency and bandwidth consumption by incorporating intelligence into the access point to optimize common IoT situations. Fog servers communicate with mobile users over single-hop wireless connections using off-the-shelf wireless interfaces, including Wi-Fi, cellular, and Bluetooth. For predefined applications, the fog server can provide cloud-like services to mobile users without the help of other fog servers or the remote cloud; alternatively, it can connect to the cloud via the Internet to use the cloud's rich computing and content resources. Many protocols, such as MQTT, CoAP, and 6LoWPAN, have provided the basis for fog computing.
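A minimal sketch of this pattern, assuming the paho-mqtt Python client (1.x-style callbacks) and placeholder broker addresses and topic names that do not come from the chapter, might have a fog node subscribe to nearby sensors over a single-hop link and publish only compact summaries toward the cloud:

```python
import json
import statistics
import paho.mqtt.client as mqtt

LOCAL_BROKER = "fog-gateway.local"   # hypothetical single-hop access point
CLOUD_BROKER = "cloud.example.com"   # hypothetical remote cloud broker
BATCH_SIZE = 20                      # forward one summary per 20 raw readings

readings = []
cloud = mqtt.Client()
cloud.connect(CLOUD_BROKER, 1883)
cloud.loop_start()

def on_message(client, userdata, msg):
    """Aggregate raw sensor readings locally; publish only a compact summary."""
    readings.append(float(msg.payload))
    if len(readings) >= BATCH_SIZE:
        summary = {"mean": statistics.mean(readings),
                   "max": max(readings),
                   "count": len(readings)}
        cloud.publish("site1/summary", json.dumps(summary))
        readings.clear()

local = mqtt.Client()
local.on_message = on_message
local.connect(LOCAL_BROKER, 1883)
local.subscribe("sensors/+/temperature")
local.loop_forever()
```

The same pattern could be built on CoAP or 6LoWPAN; MQTT is used here only because it is one of the protocols the excerpt names.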
Communication Protocols in Fog Computing: A Survey and Challenges
Published in Ravi Tomar, Avita Katal, Susheela Dahiya, Niharika Singh, Tanupriya Choudhury, Fog Computing, 2023
In general, fog nodes are placed close to the sensors and extend cloud computing toward them. Fog computing is a distributed computing paradigm that acts as a link between cloud data centres and Internet of Things (IoT) devices. It enables cloud-based services to be provided closer to IoT devices/sensors by providing compute, networking, and storage capabilities (Katal et al., 2021). In 2012, Cisco introduced the concept of fog computing to address the constraints that standard cloud computing imposes on IoT systems. IoT devices/sensors, together with real-time and latency-sensitive service requirements, are widely distributed near the network's edge. Fog computing places cloud-based application components on the edge devices between the cloud and the sensors. Fog computing provides mobility, computing resources, communication protocols, interface heterogeneity, and cloud integration to satisfy the needs of delay-sensitive applications spread across a large and dense geographical area. Because cloud data centres are few and geographically distant, they typically fail to fulfil the memory and networking requirements of billions of globally scattered IoT devices/sensors. The result is network congestion, excessive service delivery delay, and poor Quality of Service (QoS) (Sarkar & Misra, 2016). Fog computing is capable of constructing an enormous geographical dispersion of cloud services thanks to its networking components. In addition, fog computing supports location awareness, mobility, real-time interactions, flexibility, and interoperability (Bonomi et al., 2012). As a result, fog computing can be more efficient in terms of service delay, energy usage, network activity, and other factors (Sarkar et al., 2018).
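One way to picture this trade-off is a toy placement rule (Python; the tasks, latency figures, and thresholds are assumptions for illustration, not from the chapter) that keeps delay-sensitive work on a nearby fog node and defers bulky, delay-tolerant analytics to the distant cloud:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    deadline_ms: float   # how quickly a response is needed
    payload_mb: float    # how much data must be shipped

# Hypothetical round-trip estimates; real values depend on the deployment.
FOG_RTT_MS, CLOUD_RTT_MS = 10, 120

def place_task(task: Task) -> str:
    """Send latency-sensitive work to the nearby fog node; defer bulky,
    delay-tolerant analytics to the distant cloud data centre."""
    if task.deadline_ms < CLOUD_RTT_MS:
        return "fog"
    return "cloud" if task.payload_mb > 50 else "fog"

for t in [Task("alarm-detection", 50, 0.01), Task("monthly-report", 60000, 500)]:
    print(t.name, "->", place_task(t))
```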
Data Analysis of the Development Status of Basketball National Fitness Based on Fog Computing
Published in Applied Artificial Intelligence, 2023
With the popularity of mobile IoT devices, mobile cloud computing has become an emerging computing model for the efficient management of limited resources. Fog computing is considered a combination of wireless networking, mobile computing, and cloud computing that can provide rich computing resources to end devices. From the user's point of view, especially for computationally intensive applications and tasks, mobile cloud computing overcomes limitations such as battery life, computing power, and memory. But mobile cloud computing also has several drawbacks, including low bandwidth, weak security and privacy, low service availability, poor network compatibility, and limited energy. Unlike fog computing, which sends task requests from end devices to nearby fog nodes for processing, mobile cloud computing computes and stores end-device data in the cloud rather than processing it locally.
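The contrast can be sketched as follows (Python; node names, round-trip times, and the latency budget are hypothetical): a fog-style client probes nearby fog nodes and offloads to the closest responsive one, falling back to the remote cloud only when none meets the budget, whereas a mobile-cloud client would always take the "cloud" branch.

```python
import random

def probe_rtt(node: str) -> float:
    """Hypothetical round-trip times (ms); in practice these come from probes."""
    base = {"fog-a": 8, "fog-b": 15, "cloud": 140}[node]
    return base + random.uniform(0, 5)

def choose_target(fog_nodes, max_fog_rtt_ms=50) -> str:
    """Fog-style offloading: pick the closest responsive fog node and fall
    back to the remote cloud only if none is within the latency budget."""
    rtts = {node: probe_rtt(node) for node in fog_nodes}
    best = min(rtts, key=rtts.get)
    return best if rtts[best] <= max_fog_rtt_ms else "cloud"

print("offloading task to:", choose_target(["fog-a", "fog-b"]))
```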
A review on the impacts of connected vehicles on pavement management systems
Published in International Journal of Pavement Engineering, 2023
Mohammad Saleh Entezari, Amir Golroo
Bustamante-Bello et al. (2022) proposed a solution for pavement anomaly detection and classification using CVs, relying on V2I connectivity and a computing architecture known as fog computing. Fog computing is a platform capable of providing computation, storage, and networking services between devices and cloud data centres while offering features such as low latency, geographical distribution, wireless access, mobility, and support for real-time applications. In this article, some of the computation networks and wireless technologies used in intelligent transportation systems were first reviewed. Then, a V2I-fog computing architecture was presented for pavement defect detection and classification, comprising four parts: Communications, RSU-OBU systems, Machine learning, and Data visualisation. Unlike the other research articles reviewed, this project put more effort into the investigation and field testing of communication technologies and computation platforms than into data collection and model development. The suggested fog computing system had several steps. First, OBUs collected pavement data and stored it. As soon as the vehicle reached a wireless internet Access Point (AP), the stored data were transmitted to the AP. The AP network was connected to the RSUs, which used machine learning algorithms for preprocessing, analysing, and filtering the received data. The analysis results were then moved to a data bank and finally sent to the visualisation unit for decision making.
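A simplified sketch of this pipeline (Python; the class names, anomaly threshold, and simulated readings are illustrative and not taken from Bustamante-Bello et al.) shows the buffer-then-flush behaviour of the OBU and the filtering role of the RSU:

```python
from collections import deque

class OnBoardUnit:
    def __init__(self):
        self.buffer = deque()            # pavement readings stored on the vehicle

    def record(self, accel_z: float):
        self.buffer.append(accel_z)

    def flush_to_access_point(self, rsu: "RoadsideUnit"):
        """Upload the buffered data once a wireless AP is in range."""
        while self.buffer:
            rsu.receive(self.buffer.popleft())

class RoadsideUnit:
    def __init__(self):
        self.data_bank = []              # results later sent to visualisation

    def receive(self, reading: float):
        # Stand-in for the ML preprocessing/filtering step: keep only
        # readings that look like pavement anomalies (large vertical jolts).
        if abs(reading) > 2.0:
            self.data_bank.append({"anomaly": True, "value": reading})

obu, rsu = OnBoardUnit(), RoadsideUnit()
for sample in [0.1, 0.3, 3.2, -2.7, 0.2]:   # simulated vertical accelerations (g)
    obu.record(sample)
obu.flush_to_access_point(rsu)              # vehicle passes an AP
print("records forwarded to visualisation:", rsu.data_bank)
```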
Digital twin deployment for smart agriculture in Cloud-Fog-Edge infrastructure
Published in International Journal of Parallel, Emergent and Distributed Systems, 2023
Yogeswaranathan Kalyani, Nestor Velasco Bermeo, Rem Collier
Cloud computing, Fog computing, and Edge computing are all computing models that offer different levels of decentralisation and proximity to end-users. Cloud computing refers to the delivery of computing resources over the internet from a centralised location [3]. Fog computing extends this model by bringing the cloud closer to the edge of the network, typically at the local network level, to provide faster response times and better quality of service [4]. Fog computing can also help reduce the amount of data that needs to be sent to the cloud, by processing and analysing data locally before sending it to the cloud. Edge computing takes this concept further by bringing the computing resources even closer to the end-users, typically at the device or sensor level [5]. This enables real-time processing and analysis of data, reducing the need for data to be sent to the cloud or fog nodes. Edge computing is particularly useful in applications such as autonomous vehicles, smart homes, and industrial automation, where low latency and high reliability are critical.
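A toy end-to-end flow (Python; the tier functions and thresholds are assumptions, not from the cited references) illustrates how each tier reduces the data passed upward:

```python
def edge_filter(samples, threshold=0.5):
    """Device/sensor level: discard readings below a noise threshold."""
    return [s for s in samples if s >= threshold]

def fog_aggregate(filtered):
    """Local-network level: reduce many readings to one summary record."""
    return {"count": len(filtered), "mean": sum(filtered) / len(filtered)} if filtered else {}

def cloud_store(summary, archive):
    """Central level: long-term storage and global analytics."""
    archive.append(summary)

archive = []
raw = [0.1, 0.7, 0.9, 0.2, 1.3]   # five raw sensor readings
cloud_store(fog_aggregate(edge_filter(raw)), archive)
print(f"{len(raw)} raw readings reduced to {len(archive)} cloud record:", archive)
```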