THE ANALYTICAL POWER OF FOG COMPUTING SYSTEMS

It has already been two years since we started talking about Fog Computing. The strong expansion of Internet of Things systems opens the perspective of a new approach for these technologies, which bridge IoT and the Cloud. Here is an article published in the December 2015 issue of IT Trends magazine, based on the IoT-specific content of the Cloud Computing Catalogue – the special IoT edition released in the last week of 2015!

 

It is estimated that over 90% of existing data was created in the last two years... In fact, 2.5 quintillion bytes of data are created every day, and the volume is growing exponentially. According to a Cisco study, the proliferation of IoT systems means that 40% of the data existing in 2020 will come from sensor-based data acquisition networks. Today a jet engine can generate 1 terabyte of data during a single flight, and a global retailer can accumulate 2.5 petabytes of customer data every hour. At this moment, 99.5% of this data has never been used or analyzed...

This is the main reason why building analytics systems is imperative: systems that can use as much of this information as possible and return it as operational value and business agility.

In the first wave of the Internet, information could be delivered to analytics, but the use of historical data created value only in a limited number of applications, such as estimating the exploitation of oil fields or the statistical processing of seismic data for probabilistic modeling of future events.

Today, the processes defined more broadly as the Internet of Everything (IoE) allow a wide variety of applications to run that require a new intermediate computing resource, associated with the term Fog Computing. By extending Cloud computing functionality to the edge of the network, Fog Computing systems allow analytics to migrate right to the data sources, facilitating real-time processing and the generation of instant actions. Instead of wasting resources by transmitting large volumes of data to processing and analysis centers in the Cloud, Fog Computing systems sort and index data locally, sending to the Cloud only alerts or exceptional situations. In short, everything works as a local pre-Cloud system, hence the analogy with fog banks that cover only certain areas.
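
To make the idea concrete, here is a minimal sketch, in Python, of the kind of local filtering a fog node might apply; the threshold, the batch size and the send_to_cloud placeholder are assumptions for illustration, not part of any specific product.

```python
from statistics import mean

TEMP_ALERT_C = 85.0   # hypothetical alert threshold for a temperature sensor
BATCH_SIZE = 100      # readings aggregated locally before anything is reported

def send_to_cloud(payload: dict) -> None:
    """Placeholder for the (rare) upstream call; only alerts and summaries go here."""
    print("-> cloud:", payload)

def process_locally(readings: list[float]) -> None:
    """Sort and index data at the edge; forward only exceptions and a compact summary."""
    alerts = [r for r in readings if r > TEMP_ALERT_C]
    if alerts:
        send_to_cloud({"type": "alert", "max_temp_c": max(alerts), "count": len(alerts)})
    # Everything else stays local; only a small periodic summary is shipped upstream.
    send_to_cloud({"type": "summary", "mean_temp_c": round(mean(readings), 2),
                   "samples": len(readings)})

if __name__ == "__main__":
    batch = [70.0 + i * 0.2 for i in range(BATCH_SIZE)]   # simulated sensor batch
    process_locally(batch)
```

The point of the sketch is the asymmetry: raw readings never leave the node, while the Cloud receives only a few bytes of alerts and summaries.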

Fog Computing analytics require a more flexible network architecture, where only a few elements, such as policies, are located in the Cloud, while real-time processing and transmission functions sit at the extremities of the network. Data that is not so time-sensitive can be sent to the Cloud for long-term storage and historical analysis.

Another imperative of Fog Computing systems is the standardization of devices and data interfaces, integration with Cloud models and the construction of a scalable management infrastructure. Great importance is given to the future development of technologies such as data visualization, streaming analytics (adapting to ever larger data flows) or machine learning (improving performance based on best practices).

The most common application is intelligent traffic light systems. By bringing analytics as close as possible to the data, sensors in the transport infrastructure can identify the position of emergency vehicles (police, ambulance, fire brigade) and can adjust the duration of traffic lights to help them reach the place where they are needed faster and more safely. Traffic light nodes can interact locally with other field sensors, which can detect an approaching pedestrian or cyclist and measure the distance and speed of vehicles approaching the same intersection. Vehicles can in turn receive warning messages that can act directly on the engine brake in case of imminent danger.

An oil and gas company can use acoustic and temperature sensors to detect abnormal field conditions and prevent potential incidents. A recent study shows that 37% of IT and OT professionals believe that within the next three years most information will be analyzed locally, using mobile devices connected at the edges of the networks. Other practical applications for Fog Computing models are related to Smart Grid and Smart Cities projects, education, ecology and healthcare.
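
As a rough illustration of such local anomaly detection (the rolling z-score approach, the window size and the thresholds below are assumptions chosen for the example, not a vendor's algorithm), an edge node could flag abnormal sensor readings like this:

```python
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flags readings that deviate strongly from the recent local baseline."""

    def __init__(self, window: int = 60, z_threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.z_threshold = z_threshold

    def is_anomaly(self, value: float) -> bool:
        if len(self.history) >= 10:                    # need a minimal baseline first
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                return True                            # do not pollute the baseline
        self.history.append(value)
        return False

detector = RollingAnomalyDetector()
for reading in [21.0, 21.2, 20.9, 21.1, 21.0, 21.3, 20.8, 21.1, 21.2, 21.0, 55.0]:
    if detector.is_anomaly(reading):
        print("abnormal condition detected:", reading)   # only this event goes upstream
```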

The main advantages of the Fog Computing model:

  • Data placed close to the end user – microcontrollers allow data to be kept near the users who need it, which is easier than storing it in a data center located far away and eliminates the inherent delays in data transfer.

  • Geographic distribution – the Fog Computing model naturally extends Cloud services by creating a dense, next-generation network with numerous, geographically distributed nodes. Access to Big Data and Analytics thus becomes much faster, and administrators can support the most diverse requirements of mobile users.

  • Real support for mobility and readiness for the Internet of Everything – however large the demand for mobile access and the growth in the number of mobile devices, Fog systems allow them to be managed regardless of location and of how information is accessed.

  • Customers are ready to adopt – a growing number of providers of storage solutions and distributed computing services have begun to embrace this concept, starting from the older models of content delivery and of complex services migrated to the Cloud.

  • Integration with the Cloud and other services – the Cloud experience can be improved by isolating the data of users located at the edges of the network. From there, administrators can deliver Data Analytics, Big Data, security or storage services directly on top of a pre-existing Cloud model.


CATALOG CLOUD COMPUTING – MOBILE CLOUD

http://ittrends.ro/2015/03/catalog-cloud-computing-romania-2015-editia-a-3-a/

A technological tandem with great impact in a mobile, mobile, mobile world…

So we have reached the 3rd edition of the Catalogue, the result of the collaboration between Agora Group and cloud☁mania. For the current edition we have chosen cloud mobility solutions as the main theme.

Why Mobile Cloud Computing? See the opening article of the catalogue, which I reproduce below.

What does Mobile Cloud Computing offer us? Read the Cloud Computing Catalogue – 3rd edition, Mobile Cloud Computing!

The Cloud – Mobility – Internet of Things triad dominates the rankings of trends with strategic impact on technological development, and especially on the remodeling of business processes. The interferences resulting from the overlap of these concepts generate new models such as Mobile Cloud Computing, for the junction between Cloud and Mobility, or Fog Computing, a veritable bridge between Cloud and IoT.

Counting on our individual habits as digital consumers, which make us dependent on mobile access to business data and applications, we have chosen Mobile Cloud Computing solutions as the theme of the third edition of the Cloud Computing catalogue. The Mobility – Cloud duality is the engine of "3O" access to data and applications (Oricând-Oriunde-Orice dispozitiv: anytime, anywhere, any device).

"Is there Mobility without Cloud?" is a symbolic question that has become the editorial leitmotif of this Catalogue. A valuable contribution to this conceptual exercise came from people with great experience in IT (managers, specialists, evangelists), whom we thank for the competence of their answers.

Is the Romanian market ready for Mobile Cloud solutions? The answer can only be positive, since mobile telephony, mobile Internet access and SaaS applications are among the solutions with the greatest impact on local users. But we hope you will find the most pertinent answers in the presentation materials of our Catalogue partners, providers with a great diversity of offerings and, above all, with rich experience on the Romanian market.

We thank everyone who took part in this initiative!


Image Source: ittrends.ro

 

 

FOG COMPUTING – A NATURAL BRIDGE BETWEEN CLOUD AND IoT

"A new trend could be observed in joint relation with cloud" – this was the opening of my December 2013 article, discussing a new concept which, behind its exotic name, laid the premises for distributed cloud models. In February 2015 we could already describe Fog Computing as an essential bridge between the infinite power of the Cloud and an almost infinite number of intelligent edge points, conventionally grouped under the IoT concept.

What has happened with Fog Computing in this very short period? Cloud computing has confirmed most technology and market predictions, being involved in all major technology trends, from Big Data and Analytics to Mobility, M2M and the Internet of Everything… In parallel, the IoT concept has seen explosive development, driven mainly by the exponential increase in data volumes produced by an ever larger population and variety of intelligent devices.

Allowing for some simplification, we could describe Fog Computing's transition from its primary role as a "Cloud particularity" to a maturity stage governed by the data gravity model, where applications move toward the data. A large part of this evolution should be credited to Cisco, IBM and other technology innovators, which recognized the strategic role of distributed data and focused on applied Fog Computing research and strategic development platforms.

The need for a more distributed model comes from the obvious physical dispersion of data, which arrives in exponentially increasing volumes from more and more devices. Even the processing and storage haven offered by the Cloud cannot absorb such huge amounts of data, nor deliver the resulting information to users dispersed across wide geographical areas.

According to various sources, Ginny Nichols from Cisco was the first to use the name "Fog Computing", as a metaphorical association between the ground-level layers of cloud and the computing processes concentrated at the edges of networks. The term was then adopted by Cisco Systems as a new paradigm to sustain wireless data transfer to distributed devices in the "Internet of Things". Many other distributed computing and storage startups also embraced the new concept as an extension of earlier distributed computing models, such as content delivery networks, but assuming the delivery of more complex services using cloud technologies.

In January 2014 Cisco revealed its own fog computing vision, designed around the idea of bringing cloud computing capabilities to the edge of the network, much closer to the growing number of user devices that consume cloud services and generate increasingly massive amounts of data. Shortly after, Cisco unveiled the company's IOx platform, designed to bring distributed computing to the network edge.

According to Cisco, Fog Computing extends the cloud computing paradigm to the edge of the network. Fog and cloud use the same computing, storage and network resources, and share many of the same mechanisms and attributes, such as multi-tenancy and virtualization.

We can now speak about a Fog Computing ecosystem based on the Fog as a conceptual extension of Cloud computing, covering wider geographic locations in a denser way, and on concentrations of Fog devices that are much more heterogeneous in nature, ranging from end-user terminals and access points to network edge routers and switches. The data produced should be processed locally, in smart devices, rather than being sent to the cloud for processing. Fog computing is one approach to dealing with the demands of the ever-increasing number of small connected devices, sometimes referred to as the Internet of Things (IoT).

In the IoT scenario, a thing is any natural or man-made object that can be assigned an IP address and provided with the ability to transfer data over a network. Some of these things can create a lot of data. Michael Enescu offered as an example a jet engine, which can create 20 terabytes of engine performance data in one hour. Sending all that data to the cloud and receiving the results back involves an increasing demand for bandwidth, a considerable amount of time and induced latency. In a Fog Computing environment, a large part of the data processing would take place locally, in a router, rather than the raw data having to be transmitted.
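
A hedged sketch of that data-reduction idea, with invented numbers and function names: instead of streaming every raw sample upstream, an edge gateway can condense a high-rate telemetry stream into compact per-window summaries.

```python
import random

def engine_samples(n: int):
    """Simulated high-rate engine telemetry (temperature in °C); stands in for the raw stream."""
    for _ in range(n):
        yield 600 + random.gauss(0, 15)

def summarize_at_edge(samples, window: int = 1000) -> list[dict]:
    """Reduce the raw stream to per-window summaries; only these would cross the WAN."""
    summaries, buf = [], []
    for s in samples:
        buf.append(s)
        if len(buf) == window:
            summaries.append({"min": min(buf), "max": max(buf),
                              "avg": sum(buf) / len(buf), "n": len(buf)})
            buf.clear()
    return summaries

if __name__ == "__main__":
    for summary in summarize_at_edge(engine_samples(5000)):
        print(summary)    # 5 summaries instead of 5,000 raw readings
```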

To sum up, the main characteristics of the Fog are:

  1. Low latency and location awareness;
  2. Wide-spread geographical distribution;
  3. Wide-area and real-time mobility access;
  4. Increasing number and diversity of nodes;
  5. Essential role of wireless access;
  6. Strong presence of streaming and real-time applications;
  7. Heterogeneity of devices and data sources.

The most illustrative examples of Fog Computing models in real IoT success stories are related to projects for Connected Vehicles, Smart Grid, Smart Cities, Education, Ecology or Health Care. As described in the Cisco research group article, a Connected Vehicle deployment displays various connectivity scenarios: cars to cars, cars to access points (Wi-Fi, 3G, LTE, roadside units [RSUs], smart traffic lights), and access points to access points. The Fog has a number of attributes that make it the ideal platform to deliver a rich menu of SCV services in infotainment, safety, traffic support and analytics: geo-distribution (throughout cities and along roads), mobility and location awareness, low latency, heterogeneity, and support for real-time interactions.

A smart traffic light system is built around smart traffic light nodes, which interact locally with a number of terrain sensors that detect the presence of pedestrians and bikers and measure the distance and speed of approaching vehicles. Each node also interacts with neighboring lights to coordinate the green traffic wave. Based on this information, the smart light sends warning signals to approaching vehicles, and can even modify its own cycle to prevent accidents.
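
A minimal sketch of that local decision loop is shown below; the class, thresholds and messages are hypothetical and only illustrate the kind of logic a fog traffic-light node could run without any cloud round trip.

```python
from dataclasses import dataclass

@dataclass
class VehicleTrack:
    distance_m: float     # distance to the intersection, reported by a terrain sensor
    speed_mps: float      # approach speed

class SmartTrafficLight:
    """Local decision loop of a fog traffic-light node (illustrative only)."""

    def __init__(self, warn_eta_s: float = 3.0):
        self.warn_eta_s = warn_eta_s
        self.state = "green"

    def on_sensor_update(self, pedestrian_present: bool,
                         vehicles: list[VehicleTrack]) -> list[str]:
        actions = []
        for v in vehicles:
            eta = v.distance_m / max(v.speed_mps, 0.1)
            if pedestrian_present and eta < self.warn_eta_s:
                actions.append(f"warn vehicle (eta {eta:.1f}s)")   # message to the approaching car
        if pedestrian_present and self.state == "green":
            self.state = "red"                                     # modify own cycle locally
            actions.append("switch to red for pedestrian")
        return actions

light = SmartTrafficLight()
print(light.on_sensor_update(True, [VehicleTrack(distance_m=25, speed_mps=14)]))
```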

What is the future of Fog Computing? According to IBM, as sensors move into more things and data grows at an enormous rate, a new approach to hosting applications will be needed. Fog computing, which can inventively utilize existing devices, could be the right approach to hosting an important new set of applications.

It may look like a paradox, but the movement toward the edge will not decrease the importance of the core. On the contrary, it means that the data center needs to become an even stronger nucleus of the expanding computing architecture. One argument, made by InformationWeek, is that the server market was not hurt by Cloud expansion: hybrid computing models, Big Data and IoT have contributed to server market growth as a direct consequence of the increasing demand for data volume and data processing.

Every day more and more devices are connected to the Internet. By 2020, IoT will bring more than 31 billion connected terminals according to Intel, or 50 billion smart devices in a more optimistic prediction from Cisco. But while the expansion of intelligent devices can be more or less predicted over the short or medium term, the extraordinary increase in data volume is hard to estimate.

Until smarter analytics and research tools are invented, let's keep these figures in mind: IoT traffic will grow by 82% until 2017, and more than 90% of today's data was created in the last two years…

 

Related article: FOG COMPUTING, A NEW STAR IN CLOUD UNIVERSE, Radu Crahmaliuc, cloud☁mania, December 2013

Image Source: “San Francisco Fog”, flickr.com

FOG COMPUTING, A NEW STAR IN CLOUD UNIVERSE:

 

A new trend could be observed in joint relation with cloud, starting last summer. The Fog Computing concept aims to gather services, workloads, applications and large data volumes in the same place and to deliver them all to the edge of the new-generation network. The main goal is to offer core data, computing power, storage, memory and application services at a truly distributed level.

The roots of the new concept come from the fact that today's data is extremely dispersed, and the main needs are related to continuous delivery, in large volumes, to a huge number of users with different devices. Before designing an effective cloud model, businesses need to learn how to deliver large content volumes to users on a geographically distributed platform.

The idea of Fog Computing is to distribute all data and place it closer to the user, removing network delays and other possible obstacles related to data transfer. Users need to have all their data and apps at any time, in any place. What is new here? This is the essence of cloud services. But Fog Computing could take this service to the next level.

The new Fog Computing concept is based on the abstract notion of the "Drop", a micro-controller chip with built-in memory and a data transfer interface which, combined with a wireless mesh connection chip, can connect different sensors for temperature, light, voltage, etc. into a truly distributed network of data and devices, located all around the world.
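
Since the "Drop" is described only in the abstract, the following Python sketch is purely illustrative (invented class and field names, no real mesh protocol): a tiny node object that samples its sensors, keeps readings in its own memory and hands them to mesh neighbors.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Drop:
    """Toy model of a 'Drop': a node with a little memory and a wireless mesh link."""
    node_id: str
    sensors: dict[str, Callable[[], float]]           # e.g. {"temp_c": read_temp}
    neighbors: list["Drop"] = field(default_factory=list)
    memory: list[dict] = field(default_factory=list)  # built-in local storage

    def sample(self) -> dict:
        reading = {name: read() for name, read in self.sensors.items()}
        self.memory.append(reading)                   # keep data on the node itself
        return reading

    def broadcast(self, reading: dict) -> None:
        for peer in self.neighbors:                   # hand the reading to mesh neighbors
            peer.memory.append({"from": self.node_id, **reading})

# Two Drops meshed together, each with a fake temperature sensor
a = Drop("drop-a", {"temp_c": lambda: 21.4})
b = Drop("drop-b", {"temp_c": lambda: 19.8})
a.neighbors.append(b)
a.broadcast(a.sample())
print(b.memory)   # drop-b now holds drop-a's reading without any cloud round trip
```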

The key advantages of Fog Computing are related to:

  • Data placed closer to the final user – the Drops allow keeping data close to the user instead of storing it in a distant data center, eliminating possible delays in data transfer.
  • Creating dense geographical distribution – Fog computing extends direct cloud services by creating an edge network which sits at numerous points. This dense, geographically dispersed infrastructure helps in numerous ways: big data and analytics can be done faster, administrators are able to support location-based mobility demands without traversing the entire WAN, and these edge (Fog) systems can be built so that real-time data analytics becomes a reality on a truly massive scale.
  • True support for mobility and the IoE – there is a direct increase in the number of devices and the amount of data that we use. Administrators are able to leverage the Fog and control where users are coming in and how they access this information. As more services are created to benefit the end user, edge and Fog networks will become more prevalent.
  • Numerous verticals are ready to adopt – the term "fog computing" has been embraced by Cisco Systems as a new paradigm to support wireless data transfer to distributed devices in the "Internet of Things." A number of distributed computing and storage start-ups are also adopting the phrase. It builds upon earlier concepts in distributed computing, such as content delivery networks, but allows the delivery of more complex services using cloud technologies.
  • Seamless integration with the cloud and other services – with Fog services it is possible to enhance the cloud experience by isolating user data that needs to live on the edge. From there, administrators are able to tie analytics, security or other services directly into their cloud model. This infrastructure still maintains the concept of the cloud while incorporating the power of Fog Computing at the edge.

In conclusion, Fog Computing is not a replacement for Cloud Computing. Fog Computing is a big step toward a distributed cloud: by controlling data in all node points, fog computing allows the data center to be turned into a distributed cloud platform for users. Fog is an addition that develops the concept of cloud services. Thanks to the "drops", it is possible to isolate data in cloud systems and keep it close to users.

Market figures show very clearly that IT consumerization and BYOD will increase mobile data consumption. Users are more and more mobile. We need mobility to conduct our business and to better organise our personal lives. Rich content and lots of data points are pushing cloud computing platforms, literally, to the edge, where our increasingly hungry requirements are claiming more services and applications. As more services, data and applications are pushed to us, technologists will need to find ways to optimise the delivery process.

This means that bringing information closer to the end user will reduce latency and will require us to be prepared for the Internet of Everything.


Photo Source: 1bog.org

See also the new article: “FOG COMPUTING – A NATURAL BRIDGE BETWEEN CLOUD AND IoT”, 9th of February 2015
