Fog Computing Architecture: Survey and Challenges

11/22/2018
by   Ranesh Kumar Naha, et al.

Emerging technologies that generate huge amounts of data, such as Internet of Things (IoT) services, need latency-aware computing platforms to support time-critical applications. Because of the on-demand services and scalability of cloud computing, Big Data application processing is typically done in cloud infrastructure. However, managing Big Data applications exclusively in the cloud is not an efficient solution for latency-sensitive applications such as smart transportation systems, healthcare solutions, emergency response systems and content delivery applications. Thus, the Fog computing paradigm, which allows applications to perform computing operations in between the cloud and the end devices, has emerged. In a Fog architecture, IoT devices and sensors are connected to Fog devices that are located in close proximity to the users and are responsible for intermediate computation and storage. Most computation is performed at the edge, eliminating full dependence on cloud resources. In this chapter, we investigate and survey Fog computing architectures that have been proposed over the past few years. Moreover, we study the requirements of IoT applications and platforms, and the limitations faced by cloud systems when executing IoT applications. Finally, we review current research that focuses on Big Data application execution on Fog, and we discuss several open challenges as well as future research directions.
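To make the placement idea concrete, the following is a minimal, hypothetical sketch (not taken from the surveyed paper) of a latency-aware rule in which a fog node handles time-critical tasks locally and forwards the rest to the cloud. All names, round-trip times and thresholds are illustrative assumptions.

    # Hypothetical sketch: latency-aware task placement between fog and cloud.
    # Values and names are assumptions for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        deadline_ms: float     # end-to-end latency the application can tolerate

    FOG_RTT_MS = 10.0          # assumed round-trip time to the nearby fog node
    CLOUD_RTT_MS = 120.0       # assumed round-trip time to the remote cloud

    def place(task: Task) -> str:
        """Return 'fog' when only the nearby fog node can meet the deadline,
        otherwise 'cloud', where more capacity is available."""
        if task.deadline_ms < CLOUD_RTT_MS:
            return "fog"       # a cloud round trip alone would miss the deadline
        return "cloud"

    if __name__ == "__main__":
        for t in (Task("emergency-alert", deadline_ms=50),
                  Task("batch-analytics", deadline_ms=5000)):
            print(t.name, "->", place(t))

Under these assumptions, the emergency alert stays on the fog node while the batch analytics job is offloaded to the cloud, reflecting the division of labour the abstract describes.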
