The idea of Fog Computing, or Edge Computing, is to extend the cloud nearer to the Internet of Things (IoT) devices. Through the IoT, we are generating an enormous volume and variety of data. With the increase in the number of Internet-connected devices, the growing demand for real-time, low-latency services and for data security are the biggest challenges for the traditional (Cloud) computing framework. Fog Computing came into the picture to overcome these challenges. Its primary objective is to reduce the latency experienced by latency-sensitive IoT applications, such as healthcare services. This paper argues that Fog Computing is the best platform for a number of critical IoT services and applications, such as connected vehicles, the smart grid, smart cities, and wireless sensor and actuator networks (WSANs).


Keywords: Fog Computing, IoT, Cloud Computing, WSAN, latency


Fog computing is a term coined by Cisco. It describes a computing layer that lies between the cloud and the IoT devices, as shown in Fig. 1.

Fig. 1 : Fog nodes between edge & cloud

'Fog computing' is also termed 'edge computing', which essentially means that, rather than hosting and working from a centralized cloud, Fog systems operate at the network edge. Data can thus be processed locally in smart devices rather than being sent to the cloud for processing. As the IoT emerges, many sensors are being employed in various devices, combining information and computing processes to control very large collections of different objects [1], which rapidly leads to an increasing amount of data generation. Cloud computing provides a 'pay as you go' model, which is an efficient alternative to private data centres for customers running their web applications, batch processing, and other workloads. Cloud computing therefore relieves an enterprise of the burden of infrastructure specifications and details. Devices with computing capacity sense their environment by interacting with networking devices, relying on the basic property of network connectivity. The network ensures that all devices can share information with each other over single or multiple hops. As shown in Figure 1, each smart thing is linked to one of the Fog nodes. Fog nodes may be linked to each other, and each of them is linked to the Cloud.




However, the current cloud model is insufficient to handle the requirements of the IoT, due to the following issues:

Ø  Volume of the data that are produced by the IoT devices.

Ø  Latency, which means the time it takes for sensed data to go to the cloud and for commands to be sent back.

Ø  Bandwidth, which means how much channel capacity is occupied by the communication of all this IoT data.






The traditional computing architecture looks as follows: IoT devices send data to the cloud for analysis and storage, and the cloud sends back the commands or actions required.

Fig.2 : present day cloud model

So, we have all the IoT devices and, traditionally, the cloud. The IoT devices sense the physical phenomena occurring around them, send the data to the cloud, and receive back a command or required action. To reduce this round-trip time, we need FOG COMPUTING. Typically these clouds may be physically located far from the users, not just in different cities but even continents away, and this physical separation introduces a large latency in communication. In terms of volume, it is estimated that by 2020 about 50 billion IoT devices will be online. Presently, billions of devices produce exabytes of data every day, and this is very big data: an unusual amount of data is going to be produced because of the introduction of the IoT, and the device density is still increasing every day. The current cloud model is unable to process this amount of data. Private firms, factories, and airline companies produce huge amounts of data every day, and all of these data would have to be sent to the cloud for storage. Since the current cloud model cannot store all of it, the raw data has to be filtered before it is sent to the cloud for storage and processing. In terms of latency, a lot of time is taken by a data packet for a round trip. An important aspect of handling time-sensitive data is dealing with the issue of latency: time-sensitive data is real-time data, so time is important, and latency has to be handled with special attention. If the edge devices send time-sensitive data to the cloud for analysis and wait for the cloud to return a proper action, it can lead to many unwanted results. When handling time-sensitive data, a millisecond can make a huge difference.

As shown in Figure 2, these edge devices, healthcare services, or vehicles generate data that is time-sensitive in nature. Processing therefore has to be fast for the data to be used in a meaningful manner. An ambulance or any medical healthcare service, for instance, generates data whose usefulness depends on the time it takes to travel from the edge device to the cloud and back. This can be expressed with the following equation:

Latency = T(device to cloud) + T(data analysis) + T(cloud to device)

where each T is the time taken by that stage.

If this latency is large, an accident may already have occurred by the time the action reaches the device in an emergency situation. Hence 'fog computing' is very important for the IoT.
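The latency equation above can be sketched numerically. The timings below are hypothetical values chosen purely for illustration, not measurements; the point is only that shortening the two network legs dominates the saving:

```python
# Round-trip latency model:
#   Latency = T(device to cloud) + T(data analysis) + T(cloud to device)
# All timings are hypothetical, for illustration only.

def round_trip_latency(t_uplink_ms, t_analysis_ms, t_downlink_ms):
    """Total time from sensing an event to receiving a command back, in ms."""
    return t_uplink_ms + t_analysis_ms + t_downlink_ms

# Distant cloud: long network paths in both directions.
cloud_latency = round_trip_latency(t_uplink_ms=120, t_analysis_ms=10, t_downlink_ms=120)

# Nearby fog node: one-hop links, same analysis cost.
fog_latency = round_trip_latency(t_uplink_ms=5, t_analysis_ms=10, t_downlink_ms=5)

print(cloud_latency)  # 250
print(fog_latency)    # 20
```

Even with identical analysis time, moving the computation near the device cuts the round trip by an order of magnitude in this sketch.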

In terms of bandwidth: bandwidth is the bit rate of data during transmission. If all the data generated by Internet of Things (IoT) devices are sent to the cloud for storage and analysis, the traffic generated by these devices will be simply gigantic. These IoT devices would consume almost all the available bandwidth, and handling this kind of traffic would be a very hard task, with billions of devices consuming bandwidth. If all these devices come online, even IPv6 will not be able to provide facilities for all of them, and some of the data may be confidential information that firms do not want to share online. These are the different problems.
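The scale of this traffic can be sketched with back-of-the-envelope arithmetic. Taking the 50-billion-device estimate from the text and assuming (hypothetically) a deliberately modest 1 MB per device per day:

```python
# Rough IoT traffic estimate; both inputs are assumptions for illustration.
devices = 50e9               # projected devices online (estimate cited in the text)
bytes_per_device_day = 1e6   # assume 1 MB per device per day, a modest figure

total_bytes_per_day = devices * bytes_per_device_day
exabytes_per_day = total_bytes_per_day / 1e18

print(exabytes_per_day)  # 0.05 exabytes per day, even at only 1 MB per device
```

Even under this conservative assumption the aggregate is tens of petabytes daily, which is why filtering data at the edge before it reaches the cloud matters.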

One problem is the privacy of the data, which is of concern to the firms; the second is that dealing with IP-based technologies like IPv6 is a problem; and there is also the issue of billions of devices consuming bandwidth. So how do we handle them together in a synergetic manner?


Requirements of the Internet of Things (IoT):

A. Reduced latency of data:

Action at the right time prevents major accidents, machine failures, etc. A minute's delay while taking a decision can make a huge difference, and latency can be reduced by analysing the data close to the data source.

B. Data security & privacy:

Internet of Things (IoT) data must be secured and protected from intruders. Data are required to be monitored 24x7 so that appropriate action can be taken before an attack causes major harm to the network.

C. Operational reliability:

The data generated from IoT devices are used to solve real-time problems, so the integrity and availability of the data must be guaranteed; unavailability or tampering of data can be hazardous.

D. Processing of data at the respective suitable place:

Data can be divided into three types based on time sensitivity:

Ø  Extremely time-sensitive data

Ø  Less time-sensitive data

Ø  Data which are not time-sensitive

Filtering is therefore done with respect to the time sensitivity of the data. Extremely time-sensitive data should be analysed very near the data source, while data which are not time-sensitive can be analysed in the cloud. In short, time-sensitive data is processed close to the IoT devices, and non-sensitive data is sent to the cloud.
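This three-way filtering by time sensitivity can be sketched as a simple dispatcher. The tier names and deadline thresholds below are hypothetical, chosen only to illustrate the idea:

```python
# Route IoT data by time sensitivity: the most urgent data stays at the edge,
# the least urgent goes to the cloud. Thresholds are illustrative assumptions.

def route(deadline_ms):
    """Return the tier that should process data with the given response deadline."""
    if deadline_ms < 100:        # extremely time-sensitive
        return "edge device / nearest fog node"
    elif deadline_ms < 60_000:   # less time-sensitive (up to about a minute)
        return "fog aggregation node"
    else:                        # not time-sensitive
        return "cloud"

print(route(10))          # edge device / nearest fog node
print(route(5_000))       # fog aggregation node
print(route(86_400_000))  # cloud
```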

E. Monitor data across a large geographical area:

The locations of connected IoT devices can be spread across a large geographical region. For example: monitoring the railway tracks of a country or state, where the devices are exposed to harsh environmental conditions.

Fig.3 : Fog computing processing module

Characteristics of  Fog 
Computing :

a) Low latency and location awareness.

b) Wide-spread geographical distribution.

c) Mobility.

d) Very large number of nodes.

e) Predominant role of wireless access.

f) Strong presence of streaming and real
time applications.

g) Heterogeneity.



Fog Computing & the Internet of Things (IoT):

This section demonstrates the role the Fog plays in three scenarios of interest: Connected Vehicle, Smart Grid, and Wireless Sensor and Actuator Networks (WSANs).

a. Connected Vehicle (CV):

The Connected Vehicle deployment displays a rich scenario of connectivity and interactions: cars to cars, cars to access points (Wi-Fi, 3G, LTE, roadside units (RSUs), smart traffic lights), and access points to access points. The Fog has a number of attributes that make it the ideal platform to deliver a rich menu of SCV services in infotainment, safety, traffic support, and analytics: geo-distribution (throughout cities and along roads), mobility and location awareness, low latency, heterogeneity, and support for real-time interactions. A smart traffic light illustrates the latter. The smart traffic light (STL) node interacts locally with a number of sensors, which detect the presence of pedestrians and bikers and measure the distance and speed of approaching vehicles. It also interacts with neighbouring lights to coordinate the green traffic wave. Based on this information, the smart light sends warning signals to approaching vehicles, and even modifies its own cycle to prevent accidents. Any modification of the cycle is followed by re-coordination with neighbouring STLs through the orchestration layer of the Fog. The data collected by the STLs is processed to do real-time analytics (changing, for instance, the timing of the cycles in response to the traffic conditions). The data from clusters of smart traffic lights is sent to the Cloud for global, long-term analytics.
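The local decision logic of such a smart traffic light might be sketched as follows. The thresholds, sensor readings, and action names are all hypothetical, invented only to illustrate the kind of real-time rule a fog node would evaluate:

```python
# Minimal sketch of a smart traffic light's local Fog logic (assumed thresholds).

def stl_decide(vehicle_speed_kmh, vehicle_distance_m, pedestrian_present):
    """Return the list of local actions the STL node would take."""
    actions = []
    # Seconds until the approaching vehicle reaches the crossing
    # (speed converted from km/h to m/s; floor avoids division by zero).
    time_to_crossing = vehicle_distance_m / max(vehicle_speed_kmh / 3.6, 0.1)
    if pedestrian_present and time_to_crossing < 5:
        actions.append("send warning signal to vehicle")
        actions.append("extend red phase")  # modify its own cycle
        actions.append("re-coordinate with neighbouring STLs")
    return actions

print(stl_decide(60, 50, True))   # vehicle 3 s away with pedestrian: all actions
print(stl_decide(60, 500, True))  # vehicle 30 s away: no action needed yet
```

Because the whole decision runs on the local node, the reaction time is bounded by sensor sampling rather than by a cloud round trip.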

b. Smart Grid:

The Smart Grid is another rich Fog use case, whose data hierarchies help illustrate this interplay further. Fog collectors at the edge ingest the data generated by grid sensors and devices. Some of this data relates to protection and control loops that require real-time processing (from milliseconds to sub-seconds). This first tier of the Fog, designed for machine-to-machine (M2M) interaction, collects and processes the data, and issues control commands to the actuators. It also filters the data to be consumed locally, and sends the rest to the higher tiers. The second and third tiers deal with visualization and reporting (human-to-machine (HMI) interactions), as well as systems and processes (M2M). The time scales of these interactions, all part of the Fog, range from seconds to minutes (real-time analytics), and even days (transactional analytics). As a result, the Fog must support several types of storage, from ephemeral at the lowest tier to semi-permanent at the highest tier. We also note that the higher the tier, the wider the geographical coverage, and the longer the time scale. The ultimate, global coverage is provided by the Cloud, which is used as a repository for data that has a permanence of months and years, and which is the basis for business-intelligence analytics. This is the typical HMI environment of reports and dashboards that display key performance indicators.
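The tiered hierarchy just described can be summarised as a small data structure. The time scales and storage types follow the paragraph above; the concrete mapping is an illustrative sketch, not a standard API:

```python
# Fog/Cloud tiers for the smart grid, as described in the text.
# Each tier maps to (time scale of its interactions, type of storage it needs).
TIERS = {
    "tier 1 (edge, M2M control)": ("milliseconds to sub-seconds", "ephemeral"),
    "tier 2 (HMI visualization)": ("seconds to minutes", "semi-permanent"),
    "tier 3 (transactional analytics)": ("days", "semi-permanent"),
    "cloud (business intelligence)": ("months to years", "permanent"),
}

for tier, (timescale, storage) in TIERS.items():
    print(f"{tier}: {timescale}, {storage} storage")
```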

Fig.4 : Tiers of Fog Computing for IoT

c. Wireless Sensor and Actuator Networks (WSANs):

The original Wireless Sensor Networks (WSNs), whose nodes are nicknamed motes [2], were designed to operate at extremely low power to extend battery life or even to make energy harvesting feasible. Most of these WSNs involve a large number of motes with low bandwidth, low energy, low processing power, and small memory, operating as sources for a sink (collector) in a unidirectional fashion. Sensing the environment, simple processing, and forwarding data to the static sink are the duties of this class of sensor networks, for which the open-source TinyOS [2] is the de facto standard operating system. Motes have proven useful in a variety of scenarios to collect environmental data (humidity, temperature, amount of rainfall, light intensity, etc.). Energy-constrained WSNs advanced in several directions: multiple sinks, mobile sinks, multiple mobile sinks, and mobile sensors were proposed in successive incarnations to meet the requirements of new applications. Yet they fall short in applications that go beyond sensing and tracking and require actuators to exert physical actions (open, close, move, focus, target, even carry and deploy sensors). Actuators, which can control either a system or the measurement process itself, bring new dimensions to sensor networks. The information flow is no longer unidirectional (from the sensors to the sink) but bi-directional (sensors to sink, and controller node to actuators). In a subtler but significant way, the network becomes a closed-loop system, in which the issues of stability and potential oscillatory behaviour cannot be ignored. Latency and jitter become a dominant concern in systems that require rapid response. S.S. Kashi and M. Sharifi [3] survey the contributions in the coordination of Wireless Sensor and Actuator Networks (WSANs). They point out that in one architectural choice, the WSAN consists of two networks: a wireless sensor network and a mobile ad hoc network (MANET). T. Banka et al. [4] stress that emergent applications demand a higher-bandwidth, collaborative sensing environment. Their experience is rooted in the CASA (Collaborative Adaptive Sensing of the Atmosphere) project. CASA [5], a multi-year, multi-partner initiative led by UMass, deployed a network of small weather radars, integrated with a distributed processing and storage infrastructure in a closed-loop system, to monitor the lower troposphere for atmospheric hazards like tornados, hailstorms, etc. Zink et al. [6] provide technical details of the deployment. The characteristics of the Fog (proximity and location awareness, geo-distribution, hierarchical organization) make it the suitable platform to support both energy-constrained WSNs and WSANs.




Design :

a. System architecture:



