“A new trend could be observed in joint relation with cloud” – so began my December 2013 article, which discussed a new concept that, behind its exotic name, laid the premises for distributed cloud models. By February 2015 we could describe Fog Computing as an essential bridge between the infinite power of the Cloud and an almost infinite number of intelligent edge points, conventionally grouped under the IoT concept.

What has happened with Fog Computing in this short period? Cloud computing confirmed most of the tech and market predictions, being involved in all major technology trends, from Big Data and Analytics to Mobility, M2M, and the Internet of Everything. In parallel, the IoT concept has seen explosive development, driven mainly by the exponential increase in data volumes produced by an ever-growing population of highly varied intelligent devices.

With some simplifying arguments, we could describe Fog Computing's transition from its primary role as a "Cloud particularity" to a maturity stage governed by the data gravity model, where applications move to the data. A large part of this evolution should be credited to Cisco, IBM, and other technology innovators that appreciated the strategic role of distributed data and focused on Fog Computing applied research and strategic platform development.

The need for a more distributed model comes from the obvious physical dispersion of data, arriving in exponentially increasing volumes from more and more devices. Even the processing and storage haven offered by the Cloud cannot absorb such huge amounts of data, nor deliver rich information content to users dispersed across wide geographical areas.

According to various sources, Ginny Nichols of Cisco was the first to use the name "Fog Computing", as a metaphoric association between the ground-level layers of a cloud and the concentrated computing processes at the edges of the network. The term "Fog Computing" was then adopted by Cisco Systems as a new paradigm to support wireless data transfer to distributed devices in the "Internet of Things." Many other distributed computing and storage startups also embraced the new concept as an extension of earlier distributed computing models – such as content delivery networks – but delivering more complex services using cloud technologies.

In January 2014 Cisco revealed its own fog computing vision, built on the idea of bringing cloud computing capabilities to the edge of the network, much closer to the growing number of user devices that consume cloud services and generate increasingly massive amounts of data. Shortly after, Cisco unveiled its IOx platform, designed to bring distributed computing to the network edge.

According to Cisco, Fog Computing extends the cloud computing paradigm to the edge of the network. Fog and cloud use the same computing, storage, and network resources, and share many of the same mechanisms and attributes, such as multi-tenancy and virtualization.

We can now speak of a Fog Computing ecosystem based on the Fog as a conceptual extension of Cloud computing – covering wider geographic locations more densely – and on concentrations of Fog devices that are far more heterogeneous in nature, ranging from end-user terminals and access points to network edge routers and switches. Data should be processed locally in smart devices rather than being sent to the cloud for processing. Fog computing is one approach to dealing with the demands of the ever-increasing number of small connected devices, sometimes referred to as the Internet of Things (IoT).

In the IoT scenario, a thing is any natural or man-made object that can be assigned an IP address and given the ability to transfer data over a network. Some of these things create a lot of data. Michael Enescu offered the example of a jet engine, which can generate 20 terabytes of engine performance data in one hour. Sending all that data to the cloud and receiving responses back involves increasing demand for bandwidth, a considerable amount of time, and induced latency. In a Fog Computing environment, much of that data processing would take place locally, in a router, rather than the raw data having to be transmitted.
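To illustrate this pattern, a fog node might aggregate raw sensor readings locally and forward only a compact summary (plus any anomalous values) to the cloud, instead of streaming everything. The following Python sketch is a hypothetical simplification – the function name, field names, and threshold are invented for the example, not part of any real fog platform:

```python
from statistics import mean

def summarize_readings(readings, threshold):
    """Aggregate raw sensor readings at the fog node.

    Only a small summary dict (and the readings that exceed the
    alert threshold) is forwarded to the cloud, instead of the
    full raw stream.
    """
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "min": min(readings),
        # Forward only the anomalous readings themselves.
        "anomalies": [r for r in readings if r > threshold],
    }

# Example: 1,000 temperature samples collapse into one small payload.
raw = [20.0 + (i % 10) * 0.1 for i in range(1000)]
payload = summarize_readings(raw, threshold=20.8)
print(payload["count"], round(payload["mean"], 2), len(payload["anomalies"]))
```

The cloud then receives a few hundred bytes per interval rather than the full sensor stream, which is the bandwidth and latency saving the jet-engine example points at.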

To summarize, the main characteristics of the Fog are:

  1. Low latency and location awareness;
  2. Wide-spread geographical distribution;
  3. Wide-area, real-time mobility access;
  4. Increasing number and diversity of nodes;
  5. Essential role of wireless access;
  6. Strong presence of streaming and real-time applications;
  7. Heterogeneity of devices and data sources.

The most illustrative examples of Fog computing models in real IoT success stories come from projects for Connected Vehicles, Smart Grid, Smart Cities, Education, Ecology, and Health Care. As described in the Cisco research group article, a Connected Vehicle deployment displays various connectivity scenarios: cars to cars, cars to access points (Wi-Fi, 3G, LTE, roadside units [RSUs], smart traffic lights), and access points to access points. The Fog has a number of attributes that make it the ideal platform to deliver a rich menu of SCV services in infotainment, safety, traffic support, and analytics: geo-distribution (throughout cities and along roads), mobility and location awareness, low latency, heterogeneity, and support for real-time interactions.

A smart traffic light system is based on smart traffic light nodes, each of which interacts locally with a number of roadside sensors that detect the presence of pedestrians and cyclists and measure the distance and speed of approaching vehicles. It also interacts with neighboring lights to coordinate the green traffic wave. Based on this information, the smart light sends warning signals to approaching vehicles and can even modify its own cycle to prevent accidents.
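The decision logic described above can be sketched in a few lines running on the light's fog node. This is a hypothetical simplification for illustration only – the class name, thresholds, and message text are invented, not taken from any real deployment:

```python
from dataclasses import dataclass, field

@dataclass
class SmartLightNode:
    """A hypothetical fog node controlling one traffic light."""
    state: str = "red"
    warnings: list = field(default_factory=list)

    def on_sensor_update(self, pedestrian_present: bool,
                         vehicle_speed_kmh: float,
                         vehicle_distance_m: float) -> None:
        # Estimate seconds until the vehicle reaches the intersection.
        speed_ms = vehicle_speed_kmh / 3.6
        eta_s = vehicle_distance_m / speed_ms if speed_ms > 0 else float("inf")

        if pedestrian_present and eta_s < 4.0:
            # Warn a fast, close vehicle and hold red while someone is crossing.
            self.warnings.append("SLOW DOWN: pedestrian crossing")
            self.state = "red"
        elif not pedestrian_present and eta_s < 10.0:
            # No one crossing: extend green for the approaching vehicle.
            self.state = "green"

node = SmartLightNode()
node.on_sensor_update(pedestrian_present=True,
                      vehicle_speed_kmh=60.0,
                      vehicle_distance_m=50.0)
print(node.state, node.warnings)
```

The point of the sketch is the placement, not the policy: the sensor-to-decision loop closes entirely at the edge, in milliseconds, without a round trip to the cloud.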

What is the future of Fog Computing? According to IBM, as sensors move into more things and data grows at an enormous rate, a new approach to hosting applications will be needed. Fog computing, which can inventively utilize existing devices, could be the right approach to hosting an important new set of applications.

It may seem a paradox, but the movement toward the edge will not decrease the importance of the core. On the contrary, it means that the data center needs to be an even stronger nucleus for the expanding computing architecture. One argument, put forward by Information Week, is that the server market was not hurt by Cloud expansion: hybrid computing models, Big Data, and IoT have contributed to server market growth as a direct consequence of rising data volumes and data processing demand.

Every day more and more devices are connected to the Internet. By 2020 IoT will bring more than 31 billion connected terminals, according to Intel, or 50 billion smart devices in a more optimistic prediction from Cisco. But while the expansion of intelligent devices can be more or less predicted over the short or medium term, the extraordinary growth of data volume is hard to estimate.

Until smarter analytics and research tools are invented, let's consider these figures: IoT traffic will grow by 82% through 2017, and more than 90% of today's data was created in the last two years…


Related article: FOG COMPUTING, A NEW STAR IN CLOUD UNIVERSE, Radu Crahmaliuc, cloud☁mania, December 2013

Image Source: “San Francisco Fog”

