Fog computing, the Cisco-initiated edge intelligence approach that extends the cloud to the edge of the network and was further developed in the OpenFog Consortium, entered the standardization stage at the end of 2017. A first unapproved draft is already available, and more results are expected any time now.
The OpenFog Reference Architecture encompasses various approaches to dispersing Information Technology (IT), Communication Technology (CT) and Operational Technology (OT) services through information messaging infrastructure as well as legacy and emerging multi-access networking technologies (IEEE Draft Standard for Adoption of OpenFog Reference Architecture for Fog Computing).
Mainly seen in the context of IoT networks and the Industrial IoT, Industry 4.0, or "advanced IoT" as the OpenFog Consortium calls it (with 5G and artificial intelligence in the equation as well, and the aim to "bridge the cloud-to-things continuum"), fog computing has evolved considerably since the OpenFog Consortium was launched at the end of 2015. That is certainly also true from the perspective of the organizations that have joined the consortium.
Striving towards an open architecture for fog computing, the consortium had its OpenFog Reference Architecture adopted as the basis for a new IEEE working group, created at the end of 2017 and working towards fog computing and networking standards. Both fog computing and edge computing are clearly on the rise as ever larger IoT projects add ever more devices (at the edge), data and various flavors of artificial intelligence to the picture, in order to make sense of it all faster and better, with the necessary prioritizations. With 5G entering the IoT picture as well (even if 5G is far from standardized, let alone a fact), the fog computing loop is more or less closed.
What You Will Learn
The need for standardization against the backdrop of fast fog and edge evolutions
However, even if edge computing and fog computing are certain to become increasingly essential in IoT, and will reach that stage faster in the industries and use cases where fog and edge make sense, standards, interoperability and the development of fog-ready devices and of mature IoT players that are fit and smart enough for fog are essential too.
Yet here too things change fast: the IoT platform market, to give one important example, already counts some established fog players. In fact, there was quite a bit of fog computing at Hannover Messe 2018, also in the scope of edge platforms, particularly IIoT platforms with edge capabilities.
The massive amounts of data produced, transported, analyzed and acted upon within industries such as transportation, healthcare, manufacturing and energy, collectively measured in zettabytes, are exposing the challenges of cloud-only architectures and of operations that reside only at the edge of the network. Fog computing works in conjunction with the cloud and across siloed operations to effectively enable end-to-end IoT, 5G and AI scenarios.
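The division of labor described above, latency-sensitive processing on a fog node near the devices with only compact summaries forwarded to the cloud, can be sketched as follows. This is a minimal illustrative example, not part of any OpenFog specification; all function and field names here are hypothetical.

```python
# Hypothetical sketch: a fog node filters and aggregates sensor data locally,
# reacting to alerts at the edge and sending only a small summary upstream
# to the cloud. Names and thresholds are illustrative assumptions.

def fog_node_process(sensor_readings, alert_threshold):
    """Run at the network edge: handle alerts locally, summarize for the cloud."""
    # Latency-sensitive path: detect out-of-range readings immediately.
    alerts = [r for r in sensor_readings if r["value"] > alert_threshold]

    # Bandwidth-saving path: reduce raw readings to a compact summary.
    values = [r["value"] for r in sensor_readings]
    summary = {
        "count": len(values),
        "mean": sum(values) / len(values) if values else 0.0,
        "max": max(values, default=0.0),
        "alerts": len(alerts),
    }
    return alerts, summary  # alerts acted on locally; summary sent to the cloud

readings = [
    {"sensor": "s1", "value": 71.0},
    {"sensor": "s2", "value": 98.5},
    {"sensor": "s3", "value": 64.2},
]
alerts, summary = fog_node_process(readings, alert_threshold=90.0)
```

In this sketch one out-of-range reading triggers a local alert while the cloud receives a four-field summary instead of the full stream, which is the essential bandwidth and latency argument for fog in the scenarios the article describes.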
We noted the presence of leading IIoT edge fog computing platform provider FogHorn Systems, recently called "a key innovator in real edge computing and complex event processing" in IoT manufacturing platforms by ABI Research and present in Hannover with Intel and Google. Fellow OpenFog Consortium member and industrial IoT platform relayr was there too (rated in MachNation's IoT Application Enablement Platform scorecard and, along with FogHorn, ranked in MachNation's IoT edge scorecard), as was their colleague Nebbiolo Technologies, a Gartner Cool Vendor 2017 founded by fog computing pioneer Flavio Bonomi, which announced an Industrial IoT solution partnership with Toshiba Digital Solutions Corporation at the fair. Austrian TTTech (in which Samsung invested in September 2017 as part of its automotive technology investment fund) had signed an agreement with Molex before the event and showed edge computing devices that are part of its Nerve platform, which integrates fogOS and fogSM software from Nebbiolo. Microsoft and a number of other smaller and bigger vendors that offer fog solutions and are members of the OpenFog Consortium were present as well. Last but not least, there was of course also Cisco.
It's clear: with the increasing importance of both fog and edge, the time for standardization, in order to move to the next stage and broaden the ecosystem of supporting vendors, AI specialists and more, has come.
With IT spend on edge infrastructure expected to reach up to 18 percent of total IoT infrastructure spend by 2020, according to the "IDC FutureScape: Worldwide IoT 2018 Predictions" (which we briefly tackled in a post on Internet of Things spending in 2018), those fog standards are now coming. Note that for IDC, fog computing and edge computing overlap, as the firm looks at them from the IoT use case perspective.
With IoT in manufacturing being the main spending area in IoT overall, and with the types of manufacturing use cases, as IT and OT converge, fitting the scope of fog/edge, it won't come as a surprise that manufacturing also leads from the edge intelligence spending perspective, as IDC already noted in 2017.
The time to mainstream for edge execution in manufacturing is pretty short according to IDC's Worldwide Operations Technology 2018 predictions, yet the bubble showing the complexity/cost to address is pretty large too. Nevertheless, according to the research firm, by 2019 10% of manufacturers will have started to deploy cloud-based execution models that depend on edge analytics. More about IDC's view on IoT infrastructure, composed of core, cloud, edge and fog, can be found in an IDC 2017 IoT infrastructure report.
Progress on the IEEE fog computing standards work as the narrative around fog shifts
Back to the IEEE standardization work. The first meeting of the IEEE Standards Working Group on Fog Computing and Networking Architecture Framework took place at the end of 2017, and results should be available around the time you read this, or a bit later.
No doubt the second edition of the Fog World Congress, taking place early October 2018 in San Francisco, will report on the progress of the OpenFog Reference Architecture, which was described on the occasion of the IEEE announcement as "a universal technical framework designed to enable the data-intensive requirements of IoT, 5G and AI applications and a structural and functional prescription of an open, interoperable, horizontal system architecture for distributing computing, storage, control and networking functions closer to the users along a cloud-to-thing continuum".
OpenFog Consortium chairman Helder Antunes (Cisco) said that the mandate for fog computing is growing stronger, driven by the recognition that traditional architectures can't deliver on the operational challenges of today's advanced digital applications. John Zao, Chair of the IEEE Standards Working Group on Fog Computing and Networking Architecture Framework, added that the IEEE and OpenFog Consortium collaboration was a giant step forward for fog computing and for the industry, which will soon have the specifications needed to develop industrial-strength fog-based hardware, software and services.
You can follow the work of the IEEE working group here and purchase the first "IEEE Draft Standard for Adoption of OpenFog Reference Architecture for Fog Computing", released early 2018 and fully known as P1934/D1.0, Feb 2018, unapproved draft, here, to see where the fog computing standardization efforts are heading. More is expected later in the spring of 2018 and/or at the October 2018 Fog World Congress.
On the occasion of the announcement of that second Fog World Congress in April 2018, Lynne Canavan, co-chair of Fog World Congress and vice president of marketing at the OpenFog Consortium, stated that while the first event was largely about what fog is and why it is even needed, the conversation "has shifted to fog is now moving innovation from the lab to the real world, from package-delivering drones to factory-floor robots, and from privacy in connected cities to robotic surgery; we're hearing the mantra 'this is fog' growing louder."
Time for those standards.
Top image: Shutterstock – Copyright: BeeBright – All other images are the property of their respective mentioned owners.